Electric Vehicles

I work on Tesla’s Autopilot team. I watch hours of customers’ driving videos every day and am monitored constantly.



by inuni1

13 Comments

  1. OverlyOptimisticNerd

    > When we had concerns they were often brushed off. There were some times we were told to ignore “No Turn on Red” or “No U-Turn” signs. Those were the kind of things that made me and my coworkers uncomfortable. In some cases, they would hear us out, but other times the general response was along the lines of “Mind your business and your pay grade.”

    This doesn’t surprise me. You see FSD making the same mistakes, and, as usual, it’s middle management thinking they know more than the people working on the project itself. Every time, regardless of industry.

    Middle management is the position most easily made redundant by AI and telework policies.

  2. binaryhellstorm

    Sounds about right. It seems like most of the “AI” software we’re seeing outside text and image generation is just an army of underpaid and abused workers being the wizard behind the curtain. The companies doing this are hoping they’ll hit some magical tipping point where the machine starts working the way they want, but that’s not stopping them from expanding in the meantime. Like Amazon and their Just Walk Out stores, which never worked without an army of people behind the scenes doing the “AI” part. That hasn’t stopped them from rolling out the tech to airports, concert venues, and theme parks.

  3. TheSimham

    For the upcoming generation, driving a car will be rocket science.

  4. tanrgith

    That’s sorta the job if you work as a data annotator on something like autopilot, no?

  5. Color me not surprised at all. They need so much manual data annotation to train the models. Then they keep having issues, and they never ask why the car blew that “No Turn on Red” or “No U-Turn” sign. The AI team thinks it’s in the dataset, but middle management is screwing with the data. I wish Tesla luck. I own two of them. I am convinced, however, that Waymo will become the king of this hill and crush Tesla, since their system is logging real Level 5 miles and doing real Level 5 testing. Tesla is praying for a miracle to avoid the lawsuits.

  6. AngryFace4

    I am a software quality architect. One of the roles I’ve taken on at our company is assessing how ‘metricizing’ tasks can impact the work we do. If you haven’t heard of “Goodhart’s Law” (not a *real* law, by the way), google it; that might give a little clarity into the concept.

    There is no broad answer that we can give here, but we would want to review individual scenarios and write an analysis. Two things set off bells in my mind as needing review here:

    1. the incentive to continually press keys on one’s keyboard in order to avoid getting a ‘sit-down’ with management

    This scenario makes me wonder how often employees enter junk keypresses because “this one is taking too long, gotta keep moving,” or to “get ahead on quota,” or “because I’m going on break, let me finish this real quick.”

    What impact would this amount of “junk data” (if it even is junk) ultimately have on the finished product? Are these clips reviewed by two or more people in order to “take an average”?

    2. being told to “ignore no-U-Turn signs”

    I’m not exactly sure what the context of this quote is, but I presume data entry should accurately represent the real world when possible. Whether your system then decides to ignore the real world is a different question, but I would think you’d want to begin with consistent and accurate recordings.
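    The “take an average” question above is usually answered in annotation pipelines with some form of label aggregation across annotators. A minimal sketch of simple majority voting with an agreement score (label names and the review-threshold policy here are hypothetical, not anything Tesla is known to use; production systems often use more sophisticated schemes like Dawid–Skene):

    ```python
    from collections import Counter

    def aggregate_labels(annotations):
        """Majority-vote a clip's label from several annotators.

        `annotations` is a list of per-annotator label strings for the
        same frame/object. Returns the winning label and an agreement
        ratio, so low-agreement clips can be routed back for re-review.
        """
        counts = Counter(annotations)
        label, votes = counts.most_common(1)[0]
        agreement = votes / len(annotations)
        return label, agreement

    # Three annotators, one disagrees: 2/3 agreement.
    label, agreement = aggregate_labels(["no_u_turn", "no_u_turn", "stop_sign"])
    ```

    A pipeline like this would answer the commenter’s worry directly: junk keypresses from one rushed annotator get outvoted, and clips with low agreement get flagged rather than silently trusted.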

  7. Pretend-Fig-no-paint

    This as-told-to essay is based on a conversation with a Tesla employee who spoke anonymously to protect their privacy. Business Insider has verified their identity and employment and corroborated their claims during our reporting on Tesla’s Autopilot facilities. The following has been edited for length and clarity.

    My job is to help train Tesla’s vehicles to drive themselves. A Tesla has 9 different cameras that collect data the Autopilot team goes through in order to teach the Full Self-Driving and Autopilot software how to drive like a human.

    I spend hours every day going over videos that were taken from customer cars and Tesla’s in-house test drivers. We label every little thing — from making sure a car doesn’t try to drive on the shoulder of the road to telling it how to react when a lane is closed due to construction or when there’s a four-way stop sign.


    Within the program there are many different little projects that are all combined to create the greater Full Self-Driving and Autopilot experience. We work on hyper-focused projects that can last for weeks or months. For example, you could spend months labeling road lines or teaching the vehicle how to respond to different weather conditions, like how to identify snow banks and operate when there’s snow covering lane markings.

    I’ve seen clips from all over the world — and some accidents as well

    Sometimes Tesla’s cameras will capture accidents that happen nearby. It can be difficult to watch, but as soon as we run across one, we flag it to a supervisor.


    When I first started at Tesla, it was common for people to share clips around the office, usually just odd things they saw. But, one worker took it too far. He shared a clip of a little boy who’d been hit by a Tesla while riding his bike. I thought that was sadistic.

    Tesla cracked down on image sharing and what we could access after Reuters published a story on it. They essentially told us “If you’re caught once, that’s your ticket out the door.”


    After that, you couldn’t access images outside of your allocated team folder anymore, and Tesla put watermarks on some of the images so you could easily tell where an image came from if it was redistributed. Sometimes people still pass images around the office, especially if it’s something out of the ordinary, but it doesn’t happen as often.

    There is something very strange about having this very intimate view into someone’s life. It feels odd to see someone’s daily drive, but it’s also an important part of correcting and refining the program.

    At the same time, we’re under our own microscope

    Anytime we’re making key clicks on the computer, Tesla knows what we’re doing. There are cameras basically everywhere we work so the only place you can really get any privacy is in the bathroom.

    We review about five and a half to six hours of footage per day. It can be very hard to focus. You can get in this kind of fog when you’re just watching clip after clip and it can be difficult to keep yourself sane.


    Tesla gives us a 15-minute break and a 30-minute lunch break, but you have to time them perfectly because of Tesla’s employee-monitoring software, Flide Time. It tracks keystrokes and how long we’re working on the images, which can make things tricky. To properly label some of the clips you have to use outside resources, so you’ll go out of the labeling system to review traffic laws or Tesla’s labeling policies. But anytime you’re not clicking around in the software, it tracks you as if you aren’t working, and it basically sets off an alarm to your superiors.

    If you don’t hit Flide Time, even if you were only five minutes off, you will end up in a disciplinary meeting with a team lead the next day, and you’ll get a point put against your record. You could be fired if you get three points in the span of six months.

    I’ve gotten in trouble for missing Flide Time. They bring you into a separate office and ask: “Why didn’t you make any changes to the software program for 15 minutes?” You could basically get fired for spending too long in the bathroom.

    There’s definitely a feeling that we’re just worker ants.


    When we had concerns they were often brushed off. There were some times we were told to ignore “No Turn on Red” or “No U-Turn” signs. Those were the kind of things that made me and my coworkers uncomfortable. In some cases, they would hear us out, but other times the general response was along the lines of “Mind your business and your pay grade.”

    My experience at Tesla has been different from what I thought it would be when I started. I thought it would be a great opportunity for my career, but now I view it as a dystopian company.

    A spokesperson for Tesla did not respond to a request for comment.

  8. duke_of_alinor

    Hit piece and ensuing circle jerk comments.

    Things like

    >When we had concerns they were often brushed off.

    could easily be a known problem that was already being worked on and required no action.

    The facts should have been presented without prejudice. That would be much more convincing.

  9. arcticmischief

    > A spokesperson for Tesla did not respond to a request for comment.

    Maybe because Musk got rid of their PR department?

  10. im_thatoneguy

    I’m extremely critical of the state of FSD, but nothing about this is controversial or noteworthy. These are essentially minimum-wage Captcha fillers. “Don’t worry about it” is perfectly valid advice to someone doing menial labor. Obviously the FSD team knows about “No Turn on Red.” And obviously it wasn’t the middle managers who made the call whether or not to label it; that came from the AI team. The fact that FSD ignores no-turn-on-red signs is infuriating, but it’s obviously part of a larger issue, since it hasn’t been addressed in 8 years.
