This week, I was struck by the number of regulatory-type events happening in recent weeks. The pushback against AI is becoming more visible, but it's not always an outright ban. Let's explore some of the forces pushing back against AI this week.
Social
Social advocacy groups have submitted complaints to US and EU regulators, calling on them to change how AI is developed and to launch investigations into ChatGPT. It is unclear whether these complaints will drive action, and regulators had not responded as of this writing.
What’s changing: The fight over AI and its effect on society continues to ramp up, and will likely intensify as the technology becomes more visible and prominent. These complaints target OpenAI and its public ChatGPT tool specifically, focusing on the possible harms it can cause users. It’s unclear if this will prompt any action, but we are seeing more voices entering the chaotic conversation.
What’s next: Even OpenAI’s CEO has voiced concerns about where AI may go, despite the company adding some “safety limits.” The conversation about AI, and LLMs in particular, needs more diversity and inclusion of those most likely to be impacted by the technology. I expect we’ll see other feedback loops go into action as people and groups look for ways to push back.
Technological
Tesla employs people to review footage and images from vehicles in order to understand vehicle behavior, but Reuters reports that some of these media assets were shared internally, sometimes turned into memes. Today, humans are still needed in AI training to label data and improve its quality. Yet vehicles are a private space and park in private places, so this sharing is potentially a privacy violation even if the assets were not linked to a user (they might still contain geolocation data). Ultimately, the amount of data collected (even beyond media) can reveal a lot about people, and the ramifications are complex. Yet that data is a requirement for how we build AI today.
What’s changing: Perceptions about privacy and who owns the data are complex, especially where private and public spheres overlap. Cars may be recording people walking by the vehicle, but so do security cameras at the store or traffic lights. Not a lot of privacy exists outside of your own home (or inside of it, depending on how you outfit it); what is changing is how distributed data collection and ownership are becoming. This is a long, slow shift, but there are already plenty of examples of people being tracked through public camera systems.
What’s next: Cars move between private and public spaces all the time, so what can someone reasonably expect for privacy? What kind of regulation could address the multitude of concerns here? Regulators have thus far been unable to come to strong conclusions, so the current outlook is that it may take some major breach of privacy before anything changes. Humans just aren’t good at managing complex, distributed change.
Economic
The economics of growth have won in India, judging by a recent answer the country’s Ministry of Electronics and IT gave to parliament. It states that the country is not currently considering laws or regulation to limit the growth of AI. Instead, it wants AI to have a “kinetic effect” on innovation across the digital ecosystem.
What’s changing: Nothing actually, and that is the point. Given other events in this newsletter, India is trying to clarify that it will be friendly for AI development and desires growth in this sector. The notice points to previous work done in the 2018 National Strategy for AI as sufficient for current concerns. India wants to become a global leader in AI, and today’s innovation environment favors those with light oversight.
What’s next: The notice also highlights some of the enabling requirements for AI as part of a larger strategy. India has multiple efforts underway to boost investment in data, infrastructure, and expertise/skills. Given the size of India’s technology sector, this move makes sense for its objective. Yet will this open policy generate the value expected?
Environmental
Electric vehicles are the preferred platform for AI-powered autonomous vehicles, but in some places EVs are being reconsidered or even proposed for outright bans. Switzerland is considering banning the use of EVs during power outages, and the US state of Wyoming has a proposed bill to phase out EVs by 2035.
What’s changing: Switzerland and Wyoming have very different motivations. Switzerland is considering the implications of fully electrified vehicles, and there are some pragmatic issues that arise if we don’t have backup power sources. Wyoming’s bill is a political stunt, but it reminds us that there are competing interests in existing industries that would be affected.
What’s next: Electric vehicles are a mixed bag for the environment. While they run clean, they still require major carbon emissions in their manufacture and often in their charging (depending on the power grid mix), a classic “shifting the burden” archetype. The dream to electrify everything will have a large number of impacts (many of which we could anticipate if we had the desire to do something about them), and I have great doubts that we should electrify everything.
Political
Italy recently issued a temporary ban on ChatGPT over concerns about how the tool meets EU data compliance rules, making it the first western nation to impose such a ban on an AI-powered chatbot. Now other European nations are connecting with Italian regulators to understand the discussions with OpenAI (the company behind ChatGPT) that ultimately led to the ban.
What’s changing: I have no doubt that regulators have been busy looking at ChatGPT and the many other AI tools, but once one nation takes a major step like this, the rest of the EU may follow. There is no guarantee other countries will, though; Sweden has already declared it has no plans to. The EU AI Act is still in progress, and this development likely makes it more complicated to finalize.
What’s next: The reality is that regulation will shape AI’s future, but there are a lot of questions about who will drive it and how it will play out. AI is advancing, in both technology and implications, faster than any laws can be drafted. Existing frameworks, like Europe’s GDPR, will likely be called on to carry the load of missing regulation, and new interpretations will be required. Lawsuits will likely be a leading pathway for driving change.