
Palmer Luckey’s vision for the future of mixed reality

Silicon Valley players are ready to benefit. One of them is Palmer Luckey, founder of the virtual reality headset company Oculus, which he sold to Facebook for $2 billion. After his public firing from Meta, Luckey founded Anduril, which specializes in drones, cruise missiles, and other AI-enabled technologies for the US Department of Defense. The company is now valued at $14 billion. My colleague James O’Donnell interviewed Luckey about his new favorite project: headsets for the military.

Luckey is increasingly convinced that it will be the military, not consumers, who will see the value of mixed reality equipment first: “You’ll see an AR headset on every soldier long before you see one on every civilian,” he says. In the consumer world, any headset company is competing with the ubiquity and simplicity of the smartphone, but the military weighs entirely different trade-offs. Read the interview here.

The use of AI for military purposes is controversial. Back in 2018, Google pulled out of the Pentagon’s Project Maven, an attempt to build image recognition systems to improve drone strikes, after employee walkouts over the ethics of the technology. (Google has since returned to providing services for the defense sector.) There has also long been a campaign to ban autonomous weapons, also known as “killer robots,” which powerful militaries such as the United States have refused to agree to.

But louder still are the voices of a powerful faction in Silicon Valley, such as former Google CEO Eric Schmidt, who has called on militaries to embrace AI and invest more in it to gain an advantage over adversaries. Militaries around the world have been very receptive to this message.

This is good news for the technology sector. For a start, military contracts are long-term and lucrative. Most recently, the Pentagon has purchased services from Microsoft and OpenAI for search, natural language processing, machine learning, and data science, The Intercept reported. In his interview with James, Luckey says the military is an ideal testing ground for new technologies. Soldiers do what they’re told and aren’t as picky as consumers, he explains. They are also less price-sensitive: militaries don’t mind paying a premium to get the latest version of a technology.

However, there is a serious danger in rushing powerful technologies into such high-risk areas. Foundation models pose serious national security and privacy threats, for example by potentially leaking sensitive information, argue researchers from the AI Now Institute and Meredith Whittaker, president of the communication privacy organization Signal, in a new paper. Whittaker, who was a key organizer of the Project Maven protests, says the push to militarize AI is really more about enriching tech companies than improving military operations.

Despite calls for stricter transparency rules, we are unlikely to see governments restrict their defense sectors in any meaningful way beyond voluntary ethical commitments. We are living in an era of AI experimentation, and militaries are playing with the highest stakes of all. And because of the military’s secretive nature, tech companies can experiment with the technology without the need for transparency or even much accountability. That suits Silicon Valley just fine.


Deeper Learning

How Wayve’s self-driving cars will solve one of our biggest challenges