• 1 Post
  • 321 Comments
Joined 1 year ago
Cake day: June 23rd, 2023


  • My wife & I just spent a week in London, where there are plenty of cars but very little off-street parking. We saw a significant number of EVs, ranging from Teslas and other cars to taxis, double-decker buses, and the occasional truck/lorry. We spotted one or two Tesla Supercharger stations as we made our way around the city, as well as a very small number of public parking spaces along roads that had either chargers or just outlets to plug chargers into.

    The charging infrastructure I saw certainly didn’t seem like a lot, but they clearly have some sort of grasp on the situation given the number of EVs on the road…







  • Wary why? I work remotely in IT and manage a ton of Linux systems with it. Because my company has a large number of remote employees, they limit us to Windows or Macs only, and they have pretty robust MDM, security, etc. installed on them. Since macOS is built on top of a Unix kernel, it’s much more intuitive to manage other Unix & Linux systems from it.

    Personally, I haven’t really used Windows since before Windows 10 came out, and as the family tech support department I managed to switch my wife, parents, brother, and mother-in-law all to Macs years ago as well.










  • The problem is that computer vision has a LONG way to go before it’s truly on par with human eyesight. Musk loves to crow about how cameras are sufficient since we use our eyes to drive.

    The thing is, eyes have specialized neural circuits that detect motion. They essentially filter out unnecessary information and send just the motion details to the brain. This prevents the brain from being overloaded with every detail the eye constantly sees.

    And being overloaded with everything is exactly where computer vision is today: it’s just a stream of images that the computer must analyze in full. So it works exactly opposite to how the eye & brain work.
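    The filtering described above can be sketched with a toy frame-differencing example (a loose analogy only, not a model of the retina): keep just the pixels that changed between two frames, so downstream processing sees a handful of “motion” pixels instead of every pixel of every frame.

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=10):
    """Return a boolean mask of pixels whose brightness changed.

    Cast to a signed type first so the subtraction can't wrap
    around on unsigned 8-bit values.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Two synthetic 8x8 grayscale frames: a single bright "object"
# moves one pixel to the right; the rest of the scene is static.
prev_frame = np.zeros((8, 8), dtype=np.uint8)
curr_frame = np.zeros((8, 8), dtype=np.uint8)
prev_frame[4, 2] = 200
curr_frame[4, 3] = 200

mask = motion_mask(prev_frame, curr_frame)
# Only 2 of 64 pixels flag as motion: where the object left
# and where it arrived. The static 62 are filtered out before
# any heavier analysis, which is the data reduction the comment
# credits to the eye's motion circuits.
print(mask.sum())  # → 2
```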