Wow. Just take a step back and look at the question I posed: Who is liable in an L3 autonomous vehicle accident. I know the answer to this question, but I don't think you do. I'm done trying to have a conversation about why L3 is not certified, because you are unwilling to accept the answer. This is not a debate; it's a fact that Tesla is trying to receive certification despite their system not being ready for the legal responsibilities that L3 certification requires.
I've known Tesla's programming lead for Autonomous Driving for 15 years, and my brother is a development engineer for Rivian. Might want to take a step back and be willing to listen; you might learn something.
Where did you pose a question? I don't see any sentences ending in a question mark, which is the very definition of posing a question when using the written form of the English language and proper grammar.
That said, I think we might actually be saying the same thing, just differently. I agree that when the Conditional Autonomous L3 system is in use, the human is not liable. Since NHTSA has in point of fact endorsed the SAE J3016 protocol, as has already been clearly evidenced in prior posts, we can also agree that when the Conditional Autonomous L3 system hands control back to the human driver, or when the human driver is driving the vehicle and not using the L3 system, the human driver is liable. L3 is basically a bridge between L2 ADAS systems and L4 ADS systems, hence the use of the word Conditional in all of the relevant standards documentation (including the NHTSA documentation you provided). The human can still drive the vehicle and is liable when driving any vehicle that is equipped with an L3 CADS. J3016 doesn't have anything to do with liability - it has to do with technically defining the levels of driving automation.
If we go back to your first post - you're already agreeing with me:
https://tugbbs.com/forums/threads/youre-being-lied-to-about-electric-cars.351199/post-3219686
In regard to liability, NHTSA says: When the AV systems are monitoring the roadway, the surrounding environment, and executing driving tasks (autonomy Levels 3 through 5), the vehicle itself should be classified as the driver.
Bolded emphasis mine. So: when the CADS is in use, the manufacturer is liable; when the human is driving, the human is liable; and since it's a CADS and not a full ADS, there are conditions under which the CADS cannot function that require the human driver to take over - as is clearly outlined on both the NHTSA website and in the SAE J3016 standard.
Let's ask Grok for its take on this topic:
https://x.com/i/grok/share/4BcTjHSCnv04ReyVCybnPP8YP
According to Grok at least, NHTSA doesn't actually have any written federal standard for L3 yet - so it's likely going to fall to state tort law as a result. Grok could be wrong, of course, but this is likely why Tesla and other AV manufacturers are pushing for a federal standard, including liability rules, for CADS/ADS vehicles - because meeting 50 different standards at the state level would be onerous. Not impossible, but certainly much more difficult.
Are you saying you know Ashok Elluswamy personally? Because he is the VP of Autopilot/FSD at Tesla and is still the programming lead. Or are you saying you know someone who works for Ashok? Because that's who I know, one of the programmers on Ashok's team. Lastly, do you own a Tesla? Do you use FSD every day? Are you speaking from real-world experience, in other words? Curious.

We've been Tesla investors since 2018 and owners since 2023, and we are part of the FSD early adopter program (have been for a while now). I'm no expert on liability as it relates to insurance/tort law, but if there is one thing I know, it's that the market will adapt as needed, and it will do so faster than many think, as history teaches us it always does. So whatever hurdles exist, they will be surmounted with relative haste - probably not as fast as Tesla/Musk would like, but fast enough, and certainly not on the ten-year timeline that @easyrider seems to think will be the case.