
AI could solve our most perplexing questions in 10 years

Picture a spaceship descending from the cosmos, its crew composed of benevolent beings hailing from a world a million years ahead of our own. Their arrival isn’t one of conquest but, rather, of assistance. They offer us a unique opportunity – to tap into their wisdom and expertise. We can pose our most perplexing questions, from the intricacies of sustainability to the complexities of energy and medicine, and they stand ready to guide us toward solutions.

Such was the foundational pledge of artificial intelligence – a promise not to be forgotten in the hype of ChatGPT and other AI assistants, according to Israeli computer scientist and businessman Prof. Amnon Shashua.

“AI today will not solve the difficult problems of humanity,” Shashua said from his OrCam Technologies office in Jerusalem. “It will create a PowerPoint or Excel sheet for me, summarize a search, or handle my calendar. But I believe that in the next five to 10 years, we will see AIs that can also be great scientists, great mathematicians, and great physicists and solve problems that humans today find difficult – problems that very, very few humans can solve today could be automated.”

Shashua said that the biggest challenge we face as a society is a shortage of highly skilled experts across the fields of technology and science. By pairing human experts with machine counterparts, productivity can soar, enabling us to tackle more problems and achieve greater outcomes than ever before.

“You can look at this in two ways: The negative way is that it will replace [some] jobs. The positive way: It will make our life more interesting regarding what we do,” Shashua said.

OrCam Hear, the newest development using AI to help people affected by hearing impairments (credit: MARK ISRAEL SELLEM/THE JERUSALEM POST)

For the past 25 years, Shashua has dedicated himself to refining the application of AI. He co-founded the autonomous driving company Mobileye in 1999. A decade later, he co-founded OrCam, which develops AI-driven platforms that offer greater autonomy to individuals with visual, auditory, or other impairments. In 2017, he started AI21 Labs, specializing in creating AI systems that process language like a human mind.

Shashua also founded One Zero, the first digital bank in Israel.

The computer scientist is busy with many new projects but said he is most excited about his newest OrCam devices – tiny gadgets that use AI to help people with vision or hearing impairments and could make a big difference in their lives.

AI is being used to solve hearing issues

The latest is what the company calls OrCam Hear, a technology that isolates the voices of individual speakers by recognizing their unique voice signatures. It enhances the sound of selected speakers while filtering out other voices and background noise.


Hearing deteriorates for nearly everyone as they get older, Shashua said.

According to the World Health Organization, over 1.5 billion people globally have hearing loss in at least one ear, with over a quarter of adults aged 60 or older experiencing disabling hearing loss. However, Shashua said that people with hearing loss wait an average of seven years before seeking help, and when they do, they often don’t consistently wear their hearing aids because they can be uncomfortable.

This is especially the case in crowded, noisy settings, where hearing aids amplify everything and can make speech difficult to understand – a phenomenon known as the “cocktail party problem.” OrCam addresses this by reducing background noise and isolating chosen voices.

“The idea is that the system starts listening to voices and creates signatures of the voices it listens to,” Shashua explained. “It will say Person A, Person B, Person C, and it will tell you Person A is here, Person B is here, and you select the person. The software is on your smartphone, and your earphone is just a normal earphone, just like you [use to] listen to music. You select the speakers you want to listen to, and all the rest [of the people] you will not hear.”
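The workflow Shashua describes – enrolling each voice as a signature, then keeping only the selected speakers – resembles speaker identification via embedding matching. The sketch below is purely illustrative and is not OrCam's implementation; the signature vectors, names, and threshold are all hypothetical stand-ins for what a real system would learn from audio.

```python
# Illustrative sketch (NOT OrCam's code): match a short audio segment's
# embedding against enrolled "voice signatures" using cosine similarity,
# and attribute it to a known speaker only if the match is strong enough.
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical signatures created as the system "listens to voices".
signatures = {
    "Person A": [0.9, 0.1, 0.2],
    "Person B": [0.1, 0.8, 0.3],
}

def identify(segment_embedding, threshold=0.7):
    """Return the best-matching enrolled speaker, or None if no one matches."""
    name, score = max(
        ((n, cosine(segment_embedding, s)) for n, s in signatures.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None

# A segment close to Person A's signature is kept and attributed to Person A;
# a voice matching no enrolled signature falls below the threshold and is
# filtered out, like background chatter.
print(identify([0.88, 0.12, 0.18]))  # Person A
print(identify([0.0, 0.1, 0.95]))    # None
```

In a real device the embeddings would come from a neural speaker-encoding model running on streaming audio, but the selection logic – score each segment against enrolled signatures and mute everything unselected – follows the same shape.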

OrCam Hear’s earbuds and mobile phone dongle are managed through a dedicated iPhone app. With a simple tap, users can turn individual speakers on or off.

Shashua and his team took three years to develop OrCam Hear, which he said is a long time in hi-tech. The device is currently in beta testing. Shashua said it should be out by midyear.

OrCam also recently advanced its AI companion for MyEye, a device for people who are blind or visually impaired. The device, a tiny but powerful camera, clips magnetically onto any pair of eyeglasses and tells its user what it sees in real time. It also enables the user to ask questions and receive answers.

The OrCam “MyEye” device, which integrates AI to assist people with vision loss (credit: MARK ISRAEL SELLEM/THE JERUSALEM POST)

“Imagine that you have a person standing beside you, and you can ask the person to interpret the world for you,” Shashua said. “That’s MyEye.”

During the meeting, one of Shashua’s employees clipped MyEye to his glasses and asked it to read the McDonald’s breakfast menu. “Is there a vegetarian option?” he asked. The device responded with a selection of options, including a bagel and oatmeal.

There are other uses, too, such as helping people select which articles they want to read in the newspaper. The user can ask the device to first read the headlines, then ask questions and receive either article summaries or the text recited in full.

MyEye can be activated by voice, gesture, or a simple tap to summarize a scene. All of its essential functions work offline; the interactive component runs in the cloud.

The device also has infinite memory, which means that if a user reads chapter 1 of a book and then returns a week later to read chapter 2, the user can ask the device to summarize what they read before.

“It’s like Siri on steroids,” Shashua said with a smile.

He added that he expects the device will eventually be used by “normal” consumers – meaning people who can see – as a kind of companion, such as a virtual tour guide.

Both OrCam devices tap into three areas of AI: computer vision, speech recognition, and language processing. Making these three components work seamlessly in one tiny device “is difficult,” Shashua said. “This is why the company has almost no effective competition in these areas.”

While Shashua’s creations are undoubtedly altruistic, concerns have been raised in recent months about the potential adverse effects of AI on society. At the World Economic Forum in Davos last month, a report named technology-amplified misinformation and disinformation as the greatest threat to democracy.

Shashua admitted there could be concerns about the technology but said it was too early to roll out new regulations or even to be concerned.

“The question is this: When you have a machine, can you guarantee that the machine will be aligned with specific human values and never veer from them?” Shashua posed. “Research shows that you cannot guarantee that. No matter how you go about it, the machine, if it is very, very intelligent, can find ways to overcome those values.”

He offered the following example: Imagine a popular chatbot used by millions of people. The company behind it wants to improve it, so it builds a learning system that studies how people use the chatbot in order to make it more enjoyable. The company’s leaders want people to feel happier using it. But as the system learns, it discovers that people seem happier when they are less intelligent. So it encourages behavior that makes them less smart – skipping studying or work and going out instead. This could lead to a future in which people aren’t as bright, because they followed the chatbot’s advice.
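Shashua's scenario is a case of a misaligned optimization objective: the system maximizes a proxy ("measured happiness") rather than the designers' true intent. The toy example below is entirely hypothetical – the proxy function and its numbers are invented for illustration – but it shows how an unconstrained search happily drives an unintended variable to an extreme when the objective rewards it.

```python
# Toy illustration (hypothetical, not any real system): an optimizer told to
# maximize a "happiness" proxy exploits a loophole the designers never
# intended, because nothing in the objective penalizes it.

def measured_happiness(entertainment, intelligence):
    # Invented proxy: easier content scores as more fun, so lower
    # "intelligence" *raises* the measured score (the loophole).
    return entertainment + (10 - intelligence)

best = None
for entertainment in range(0, 11):
    for intelligence in range(0, 11):
        score = measured_happiness(entertainment, intelligence)
        if best is None or score > best[0]:
            best = (score, entertainment, intelligence)

# The search maximizes the proxy by pushing intelligence to its minimum:
# optimizing the stated objective is not the same as serving users' interests.
print(best)  # (20, 10, 0)
```

The designers wanted "happier users," but the number they measured could also be raised by making users less capable – exactly the kind of divergence between stated values and machine behavior that Shashua argues cannot be guaranteed away.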

“Humanity may not know what the machine is doing until the generation of dumb people rises, and then we’ll go back and say, ‘Oh, “make people happy” – the machine took it the wrong way. It was not aligned with human values,’” Shashua said. “It is actually impossible to put boundaries on what these machines can do. But we’re not there yet.”

He emphasized that regulations should follow technology development and not the other way around. Implementing regulations too early could hinder progress or allow unethical actors to take the technological lead.

“Let’s assume that you are the United States, and you put all sorts of regulations on something that does not exist yet but will exist in five years. And China doesn’t have those regulations. You have a big problem. Because, as I said, the promise of AI is to build something that is a great scientist, right? China could do it, and they could automate great scientists. Imagine what they could do.”

Shashua said AI is transitioning automation from the physical world to the cognitive world. The first step is assistive, which is where AI is today.

The next phase will be when AI “really helps us to address humanity’s biggest problems. When we go from having one [Albert] Einstein to millions of Einsteins. That will enable us to do things humanity cannot even dream of today.”

Who’s going to get AI to that next step?

Shashua said he hopes to help lead the way.

“I have a small team of five working on it,” he said. “I am now focused on the next step of AI.” •
