That might have people asking: Wait, what? But these grand worries are rooted in research. Along with Hawking and Musk, prominent figures at Oxford and UC Berkeley and many of the researchers working in AI today believe that advanced AI systems, if deployed carelessly, could end all life on earth.
This concern has been raised since the dawn of computing. But it has come into particular focus in recent years, as advances in machine-learning techniques have given us a more concrete understanding of what we can do with AI, what AI can do for (and to) us, and how much we still don’t know.
There are also skeptics. Some of them think advanced AI is so distant that there’s no point in thinking about it now. Others are worried that excessive hype about the power of their field might kill it prematurely. And even among the people who broadly agree that AI poses unique dangers, there are varying takes on what steps make the most sense today.
The conversation about AI is full of confusion, misinformation, and people talking past each other — in large part because we use the word “AI” to refer to so many things. So here’s the big picture on how artificial intelligence might pose a catastrophic threat, in nine questions:
Artificial intelligence is the effort to create computers capable of intelligent behavior. It is a broad catchall term, used to refer to everything from Siri to IBM’s Watson to powerful technologies we have yet to invent.
Some researchers distinguish between “narrow AI” — computer systems that are better than humans in some specific, well-defined field, like playing chess or generating images or diagnosing cancer — and “general AI,” systems that can surpass human capabilities in many domains. We don’t have general AI yet, but we’re starting to get a better sense of the challenges it will pose.
Narrow AI has seen extraordinary progress over the past few years. AI systems have improved dramatically at translation, at games like chess and Go, at important research biology questions like predicting how proteins fold, and at generating images. AI systems determine what you’ll see in a Google search or in your Facebook News Feed. They are being developed to improve drone targeting and detect missiles.
But narrow AI is getting less narrow. Once, we made progress in AI by painstakingly teaching computer systems specific concepts. To do computer vision — allowing a computer to identify things in pictures and video — researchers wrote algorithms for detecting edges. To play chess, they programmed in heuristics about chess. To do natural language processing (speech recognition, transcription, translation, etc.), they drew on the field of linguistics.
But recently, we’ve gotten better at creating computer systems that have generalized learning capabilities. Instead of mathematically describing detailed features of a problem, we let the computer system learn those features by itself. While once we treated computer vision as a completely different problem from natural language processing or platform game playing, now we can solve all three problems with the same approaches.
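The shift described above — from hand-coded features to a single generic learning procedure — can be sketched in a few lines. Everything below is a toy invention for illustration, not any production system: the point is only that the identical training loop fits both a "vision-like" and a "language-like" problem, given nothing but (input vector, label) pairs.

```python
# A single generic learning loop (a simple perceptron) applied unchanged to
# two very different toy tasks. All data and tasks here are hypothetical.
def train(examples, lr=0.1, epochs=200):
    """Learn a linear decision rule from (input vector, 0/1 label) pairs."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(model, x):
    w, b = model
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# "Vision-like" task: classify 2x2 pixel-brightness grids as light (1) or dark (0).
vision = [([0.9, 0.8, 0.9, 0.7], 1), ([0.1, 0.2, 0.1, 0.3], 0),
          ([0.8, 0.9, 0.7, 0.9], 1), ([0.2, 0.1, 0.3, 0.1], 0)]
# "Language-like" task: word counts for (good, bad, plot, acting) in a review.
text = [([2, 0, 1, 1], 1), ([0, 2, 1, 1], 0),
        ([3, 0, 0, 1], 1), ([0, 1, 1, 0], 0)]

vision_model = train(vision)  # same code,
text_model = train(text)      # two unrelated domains
```

Nothing about the training loop mentions pixels or words; the problem-specific knowledge that once had to be hand-coded now lives entirely in the data.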
We associate positive feelings with foods and experiences we enjoy, and negative feelings with the opposite. One new study dives deeper than ever before into why.
HOW TO FEEL
A new study has mapped out in unprecedented detail the “neighborhoods” of the brain that assign good and bad feelings to objects and experiences. Led by MIT neuroscientist Kay Tye, the research is illuminating brain processes that neuroscientists still don’t understand, and could have implications for treating mental health disorders.
In 2016, Tye’s research team found that within the amygdala — the brain’s center for emotions — there are neurons that assign good or bad feelings, known as “valence,” to experiences. These responses are integral to human survival; it is vitally important that we remember which foods or other experiences are good, and which are bad and could sicken or kill us. The new study, published in the journal Cell Reports, more deeply explores the inner workings of valence by focusing on a particular section of the amygdala, the basolateral amygdala.
The team, led by lead author Anna Beyeler, trained mice to associate “good”-tasting sucrose drops with one audible tone, and bitter quinine drops with a different tone. They later recorded the neural responses of the mice when the different tones were played, to see which valence the mice had been conditioned to express. They then identified the neurons that play a key role in valence and engineered them to respond to light pulses. This allowed the researchers to record the electrical activity of those neurons and their neighbors, revealing what influenced local circuits and how.
By looking at these interactions and system structures close up, the team found that within the basolateral amygdala region, there are distinct and diverse “neighborhoods,” in which valence is determined through connections to other regions in the brain and interactions with the basolateral amygdala itself.
At the end of the experiment, the team had mapped over 1,600 neurons. Within these, they highlighted three different types of neurons that project to different parts of the brain and are associated with different types of valence. The team also found that different types of neurons tend to group together in “hotspots.” However, despite these tight groupings, they also noted that different types of neurons often mixed together.
The future applications of this research are still undefined, but there are hopes that by understanding how the brain processes good and bad experiences, scientists could better understand certain mental health issues and addiction. Additionally, the researchers found that different types of neurons vary in their ability to influence one another. Tye stated in a press release that this could be due to the mixing they observed: “Perhaps the intermingling that there is might facilitate the ability of these neurons to influence each other.”
“Perturbations of emotional valence processing is at the core of many mental health disorders,” said Tye in the press release. “Anxiety and addiction, for example, may be an imbalance or a misassignment of positive or negative valence with different stimuli.”
Even beyond this, there is theoretical potential in manipulating feelings or desires through control of these neurons and networks. Researchers have not suggested any plans to use the research this way, but it is certainly not out of the question.
Antibiotic resistance is an issue that affects countless people all around the world, and it’s only getting worse. A newly developed algorithm helps scientists find variants of known antibiotics to support the fight against resistance.
HUNTING FOR ANTIBIOTICS
Antibiotic resistance is a growing issue in which harmful bacteria in the body are no longer susceptible to the effects of antibiotics. Because of this issue, more and more patients struggle with everything from common illnesses to much more severe bacterial infections that could cause life-threatening harm.
One technique that could combat antibiotic resistance is finding variants of known antibiotics, or peptidic natural products (PNPs). Unfortunately, finding these variants has been an arduous and time-consuming process. Until now: a group of American and Russian computer scientists has created an antibiotic algorithm that, by rapidly sorting through databases, can discover 10 times more new variants of PNPs than all previous efforts combined.
The algorithm, known as VarQuest, is described in the latest issue of the journal Nature Microbiology. Hosein Mohimani, an assistant professor in Carnegie Mellon University’s Computational Biology Department, said in a press release that VarQuest completed a search that would have taken hundreds of years of computation using traditional methods.
Mohimani also said that the study expanded their understanding of the microbial world. Not only does finding more variants quickly increase researchers’ ability to formulate alternative antibiotics; it can also provide vital information to microbiologists.
“Our results show that the antibiotics produced by microbes are much more diverse than had been assumed,” said Mohimani in a press release.
Mohimani noted that, because VarQuest was able to find over one thousand variants and in such a short amount of time, it could give microbiologists a larger perspective, perhaps alerting them to trends or patterns that wouldn’t otherwise be noticeable.
VarQuest’s success stands on the shoulders of computing progress made within the past few years. High-throughput methods have advanced, allowing samples to be processed in batches instead of one at a time, making the process much faster. Additionally, the effort has been supported by the Global Natural Products Social (GNPS) molecular network, launched in 2016. This is a database in which researchers from around the world collect data on natural products based on their mass spectra — measurements of the mass-to-charge ratios of a substance’s ionized fragments. Using this database alongside VarQuest could drastically enhance drug discovery abilities.
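The core idea — matching observed masses against known compounds plus plausible chemical modifications — can be illustrated with a deliberately simplified sketch. This is not VarQuest’s actual algorithm, and the compound names, masses, modification list, and tolerance below are hypothetical placeholders chosen only to show the shape of the search.

```python
# Simplified, hypothetical illustration of variant hunting -- NOT VarQuest's
# real algorithm. A variant of a known peptidic natural product (PNP) often
# appears as a spectrum whose total mass differs from a known compound's mass
# by one modification (e.g. +14 Da for a methylation).
KNOWN_PNPS = {"compound A": 911.6, "compound B": 1103.6}  # masses in daltons (made up)
MODIFICATIONS = {14.0: "methylation", 16.0: "oxidation", -18.0: "water loss"}

def find_variant_candidates(observed_masses, tolerance=0.02):
    """Flag observed masses matching a known PNP plus/minus one modification."""
    hits = []
    for mass in observed_masses:
        for name, ref_mass in KNOWN_PNPS.items():
            for delta, label in MODIFICATIONS.items():
                if abs(mass - (ref_mass + delta)) <= tolerance:
                    hits.append((mass, name, label))
    return hits

# An observed mass 14 Da above "compound A" would be flagged as a
# possible methylated variant; unrelated masses fall through.
candidates = find_variant_candidates([925.6, 500.0])
```

The real system must of course handle full fragmentation spectra rather than single masses, which is where the database scale and the computational savings Mohimani describes come in.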
“Natural product discovery is turning into a Big Data territory, and the field has to prepare for this transformation in terms of collecting, storing and making sense of Big Data,” Mohimani said of this growing data and scientists’ ability to access it. “VarQuest is the first step toward digesting the Big Data already collected by the community.”
This issue is only going to get worse if steps are not taken to prevent resistance in the first place. However, as solutions are crafted, working antibiotics are still needed for those facing both resistance and infection. This antibiotic algorithm will be a critical tool in mitigating the effects of resistance while also giving microbiologists a big-picture view, hopefully propelling research forward.

Every time a person consumes an antibiotic, evolutionary pressure pushes bacterial species to develop resistance and multiply. Children and elderly adults have the highest rates of antibiotic use, but they are also higher-risk groups to begin with, making this a particularly concerning issue for them. Many partially attribute the drastic growth of the resistance problem to the needless prescription of antibiotics to patients with viral illnesses, for whom antibiotics have no effect other than to promote resistance.
A team of engineering graduates has won the prestigious James Dyson Award for their cheap and portable device that can detect melanoma, a form of skin cancer. The device could potentially save thousands of lives, as skin cancer is the most common type worldwide.
Detecting skin cancer early isn’t easy. Currently, it’s done through visual inspections or biopsies, but some doctors may not pick up on the disease using the former, while some patients may not be able to afford the latter. As such, a team of graduates from McMaster University in Canada set out to develop an inexpensive skin cancer detector, and their innovative work has earned them the prestigious international James Dyson Award.
Cancer affects the metabolic rate of skin cells, with cancerous cells heating up faster than their healthy counterparts following a shock of cold temperature.
To make identifying these cells easier, the McMaster University team — Michael Takla, Rotimi Fadiya, Prateek Mathur, and Shivad Bhavsar — built a skin cancer detector with 16 thermistors that can track the rate of temperature increase following a cold shock from an ice pack.
The thermistors are simply placed on the potentially cancerous area of skin, and the device produces a heat map that can be used to determine the presence of melanoma.
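A minimal sketch of the kind of processing such a device might perform — the sensor layout, sampling interval, and all numbers below are assumptions for illustration, not details from the sKan team:

```python
# Hypothetical sketch: estimate each thermistor's rewarming rate after a cold
# shock by fitting a line to its readings, then arrange the 16 rates as a
# 4x4 heat map. Cancerous tissue is expected to rewarm faster than its
# healthy neighbors.
def warming_rate(temps, dt=1.0):
    """Least-squares slope (degrees C per second) of evenly spaced readings."""
    n = len(temps)
    ts = [i * dt for i in range(n)]
    t_mean = sum(ts) / n
    y_mean = sum(temps) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, temps))
    den = sum((t - t_mean) ** 2 for t in ts)
    return num / den

def heat_map(readings, dt=1.0):
    """readings: 16 time series, one per thermistor, in row-major 4x4 order."""
    rates = [warming_rate(r, dt) for r in readings]
    return [rates[i * 4:(i + 1) * 4] for i in range(4)]
```

A clinician (or a follow-up classifier) would then look for cells of the map whose rewarming rate stands out from the surrounding tissue.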
“By using widely available and inexpensive components, the sKan allows for melanoma skin cancer detection to be readily accessible to the many,” award founder James Dyson said in a statement announcing the win. “It’s a very clever device with the potential to save lives around the world.”
In addition to winning the Dyson Award for their skin cancer detector, the team also received a cash prize of approximately $40,000 to advance their research. They had previously received $10,000 at the Forge’s Student Start-up Pitch competition in March.
DIAGNOSING SKIN CANCER
According to Mathur, the team was inspired to create sKan after realizing technology hadn’t had the same impact on skin cancer diagnosis as it had on other medical fields.
“We found research that used the thermal properties of cancerous skin tissue as a means of detecting melanoma. However, this was done using expensive lab equipment,” he said in a McMaster University news release. “We set out to apply the research and invent a way of performing the same assessment using a more cost-effective solution.”
Going forward, the sKan team hopes to create a more advanced prototype that will allow them to begin pre-clinical testing.
As reported by The Guardian, nearly 39 people are diagnosed with skin cancer every day in the U.K., and the American Cancer Society (ACS) estimates 87,110 new cases of melanoma will be diagnosed in the U.S. in 2017, with 9,730 people dying from the condition. Early detection is key to cancer survival, so if sKan succeeds, it could significantly reduce those numbers.
“Our aspirations have become a reality,” said Mathur. “Skin cancers are the most common form of cancer worldwide, and the potential to positively impact the lives of those affected is both humbling and motivating.”
To make it easier for people in the United Kingdom to spend their various cryptocurrencies, startup London Block Exchange is launching a new Visa debit card called the Dragoncard. It pays the retailer in pounds, then takes money from the consumer’s crypto wallet.
Cryptocurrencies such as ether and bitcoin are surging in popularity thanks to their many benefits over traditional currencies, but they still lag behind those currencies in one key way: they are not easy to spend in physical stores. People can spend USD and euros using a plethora of debit, credit, and gift cards, but their options are severely limited when it comes to spending bitcoin or ether using a cryptocurrency debit card.
That’s starting to change, though. The Centra Card can be used just like a debit card to spend bitcoin, ether, dash, and several other popular cryptocurrencies. Token Card is another cryptocurrency debit card, and soon, London startup London Block Exchange (LBX) will launch a prepaid Visa debit card that will act in the same fashion.
The Dragoncard will allow people to convert their bitcoin, ether, ripple, litecoin, and monero to sterling (aka the British pound) at the time of purchase, thereby making it significantly easier for those currencies to be spent in stores throughout the United Kingdom, including ones that have yet to accept alternative forms of payment.
Business Insider reports the cryptocurrency debit card will be issued by pre-paid card provider Wavecrest, and it comes alongside an app people can use to buy and manage cryptocurrencies on LBX’s own exchange. When someone uses the Dragoncard, LBX will pay the retailer in pounds first, then take the equivalent amount from the shopper’s cryptocurrency wallet.
Before rushing off to get a Dragoncard when it debuts in December, though, interested crypto owners should know a few things. First, the card itself costs £20 ($26.33). Second, they will be charged a 0.5 percent fee whenever they buy or sell cryptocurrencies on LBX’s platform. Lastly, provider Wavecrest charges a small fee for ATM withdrawals — it is a debit card, after all.
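As a back-of-the-envelope illustration of those costs — only the £20 card price and the 0.5 percent trading fee come from the article; the exchange rate and the assumption that the fee is added on top of the purchase are hypothetical:

```python
# Illustrative-only sketch of the Dragoncard's economics. The exchange rate
# and fee mechanics are assumptions; only the 20 pound card price and the
# 0.5 percent LBX trading fee are from the article.
CARD_PRICE_GBP = 20.00
TRADE_FEE = 0.005  # 0.5% fee on LBX buys/sells

def crypto_debited(purchase_gbp, gbp_per_coin):
    """Coins deducted from the wallet, assuming the fee is added on top."""
    gbp_needed = purchase_gbp * (1 + TRADE_FEE)
    return gbp_needed / gbp_per_coin

# e.g. a 50 pound purchase paid in bitcoin at an assumed 5,000 GBP per BTC:
# LBX pays the shop 50 pounds, then debits 50.25 pounds' worth of coins.
coins = crypto_debited(50.00, 5000.00)
```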
THE PATH TO ACCEPTANCE
Despite the fees, the Dragoncard and other cryptocurrency debit cards have the potential to help crypto become widely accepted and, more importantly, understood.
The Dragoncard also arrives at a time when bitcoin is experiencing quite a growth spurt. With schools, companies, and even nations starting to embrace bitcoin, the currency is poised to continue increasing in value and popularity, and with the Dragoncard, LBX is hoping to help Londoners join that ever-growing segment of crypto supporters.
“Despite being the financial capital of the world, London is a difficult place for investors to enter and trade in the cryptocurrency market,” LBX founder and CEO Benjamin Dives reportedly said in a statement. “We’ll bring it into the mainstream by removing the barriers to access, and by helping people understand and have confidence in what we believe is the future of money.”
“We’re offering a grown up and robust experience for those who wish to safely and easily understand and invest in digital currencies,” said LBX’s executive chairman Adam Bryant. “We’re confident we’ll transform this market in the U.K. and will become the leading cryptocurrency and blockchain consultancy for institutional investors and consumers alike.”
Disclosure: Several members of the Futurism team, including the editors of this piece, are personal investors in a number of cryptocurrency markets. Their personal investment perspectives have no impact on editorial content.
During the mid- to late-twentieth century, quantum physicists picked apart the unified theory of physics that Einstein’s theory of relativity offered. The physics of the large was governed by gravity, but only quantum physics could describe observations of the small. Since then, a theoretical tug-of-war between gravity and the other three fundamental forces has continued as physicists try to extend gravity or quantum physics to subsume the other as more fundamental.
Recent measurements from the Large Hadron Collider show a discrepancy with Standard Model predictions that may hint at entirely new realms of the universe underlying what’s described by quantum physics. Although repeated tests are required to confirm these anomalies, a confirmation would signify a turning point in our most fundamental description of particle physics to date.
Quantum physicists found in a recent study that mesons don’t decay into kaon and muon particles as often as the Standard Model predicts. The authors argue that boosting the power of the Large Hadron Collider (LHC) could reveal the new kind of particle responsible for this discrepancy. Although errors in the data or the theory, rather than a new particle, may explain the discrepancy, an upgraded LHC would still prove a boon for several projects on the cutting edge of physics.
The Standard Model
The Standard Model is a well-established fundamental theory of quantum physics that describes three of the four fundamental forces believed to govern our physical reality. Quantum particles occur in two basic types, quarks and leptons. Quarks bind together in different combinations to build particles like protons and neutrons. We’re familiar with protons, neutrons, and electrons because they’re the building blocks of atoms.
The “lepton family” features heavier versions of the electron — like the muon — and the quarks can coalesce into hundreds of other composite particles. Two of these, the Bottom and Kaon mesons, were culprits in this quantum mystery. The Bottom meson (B) decays to a Kaon meson (K) accompanied by a muon (mu-) and anti-muon (mu+) pair.
They found a 2.5 sigma variance, or 1 in 80 probability, “which means that, in the absence of unexpected effects, i.e. new physics, a distribution more deviant than observed would be produced about 1.25 percent of the time,” Professor Spencer Klein, senior scientist at Lawrence Berkeley National Laboratory, told Futurism. Klein was not involved in the study.
This means the frequency of mesons decaying into strange quarks during the LHC proton-collision tests fell a little below the expected frequency. “The tension here is that, with a 2.5 sigma [or standard deviation from the normal decay rate], either the data is off by a little bit, the theory is off by a little bit, or it’s a hint of something beyond the standard model,” Klein said. “I would say, naïvely, one of the first two is correct.”
To Klein, this variance is inevitable considering the high volume of data run by computers for LHC operations. “With petabyte-sized (10^15 bytes) datasets from the LHC, and with modern computers, we can make a very large number of measurements of different quantities,” Klein said. “The LHC has produced many hundreds of results. Statistically, some of them are expected to show 2.5 sigma fluctuations.” Klein noted that particle physicists usually wait for a 5-sigma fluctuation before crying wolf — corresponding to roughly a 1-in-3.5-million fluctuation in data.
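The sigma-to-probability conversions quoted here follow directly from the tails of the normal distribution and can be checked with the standard library. (Whether a one-sided or two-sided tail is meant varies by context; both are shown.)

```python
import math

def two_sided_p(sigma):
    """Probability of a fluctuation at least this many sigmas in either direction."""
    return math.erfc(sigma / math.sqrt(2))

def one_sided_p(sigma):
    """Probability of a fluctuation at least this many sigmas in one direction."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p25 = two_sided_p(2.5)  # ~0.0124: about 1.25 percent, i.e. roughly 1 in 80
p40 = two_sided_p(4.0)  # ~6.3e-5: roughly 1 in 16,000
p50 = one_sided_p(5.0)  # ~2.9e-7: roughly 1 in 3.5 million
```

All three figures in the article — 1 in 80 at 2.5 sigma, 1 in 16,000 at 4 sigma, and 1 in 3.5 million at 5 sigma — drop out of the same formula.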
These latest anomalous observations do not exist in a vacuum. “The interesting aspect of the two taken in combination is how aligned they are with other anomalous measurements of processes involving B mesons that had been made in previous years,” Dr. Tevong You, co-author of the study and junior research fellow in theoretical physics at Gonville and Caius College, University of Cambridge, told Futurism. “These independent measurements were less clean but more significant. Altogether, the chance of measuring these different things and having them all deviate from the Standard Model in a consistent way is closer to 1 in 16000 probability, or 4 sigma,” Tevong said.
Extending the Standard Model
Barring statistical or theoretical errors, Tevong suspects that the anomalies mask the presence of entirely new particles, called leptoquarks or Z prime particles. Inside bottom mesons, quantum excitations of new particles could be interfering with normal decay frequency. In the study, researchers conclude that an upgraded LHC could confirm the existence of new particles, making a major update to the Standard Model in the process.
“It would be revolutionary for our fundamental understanding of the universe,” said Tevong. “For particle physics […] it would mean that we are peeling back another layer of Nature and continuing on a journey of discovering the most elementary building blocks. This would have implications for cosmology, since it relies on our fundamental theories for understanding the early universe,” he added. “The interplay between cosmology and particle physics has been very fruitful in the past. As for dark matter, if it emerges from the same new physics sector in which the Zprime or leptoquark is embedded, then we may also find signs of it when we explore this new sector.”
The Power to Know
So far, scientists at the LHC have only observed ghosts and anomalies hinting at particles that exist at higher energy levels. To prove their existence, physicists “need to confirm the indirect signs […], and that means being patient while the LHCb experiment gathers more data on B decays to make a more precise measurement,” Tevong said. “We will also get an independent confirmation by another experiment, Belle II, that should be coming online in the next few years. After all that, if the measurement of B decays still disagrees with the predictions of the Standard Model, then we can be confident that something beyond the Standard Model must be responsible, and that would point towards leptoquarks or Zprime particles as the explanation,” he added.
To establish their existence, physicists would then aim to produce the particles in colliders the same way Bottom mesons or Higgs bosons are produced, and watch them decay. “We need to be able to see a leptoquark or Zprime pop out of LHC collisions,” Tevong said. “The fact that we haven’t seen any such exotic particles at the LHC (so far) means that they may be too heavy, and more energy will be required to produce them. That is what we estimated in our paper: the feasibility of directly discovering leptoquarks or Zprime particles at future colliders with higher energy.”
Quantum Leap for the LHC
Seeking out new particles in the LHC isn’t a waiting game. The likelihood of observing new phenomena is directly proportional to how many new particles pop up in collisions. “The more the particle appears the higher the chances of spotting it amongst many other background events taking place during those collisions,” Tevong explained. For the purposes of finding new particles, he likens it to searching for a needle in a haystack; it’s easier to find a needle if the haystack is filled with them, as opposed to one. “The rate of production depends on the particle’s mass and couplings: heavier particles require more energy to produce,” he said.
This is why Tevong and co-authors B.C. Allanach and Ben Gripaios recommend either extending the LHC loop’s length, thus reducing the amount of magnetic power needed to accelerate particles, or replacing the current magnets with stronger ones.
According to Tevong, the CERN laboratory is slated to keep running the LHC in its present configuration until the mid-2030s. Afterward, they might upgrade the LHC’s magnets, roughly doubling their strength. In addition to souped-up magnets, the tunnel could see an enlargement from the present 27 km to 100 km (17 to 62 miles). “The combined effect […] would give about seven times more energy than the LHC,” Tevong said. “The timescale for completion would be at least in the 2040s, though it is still too early to make any meaningful projections.”
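The “about seven times more energy” figure follows from simple scaling: a ring collider’s beam energy grows roughly in proportion to magnetic field strength times ring radius, so the two upgrades multiply. A quick check of the arithmetic:

```python
# Rough scaling check: beam energy ~ (field strength) x (ring radius), and
# radius scales with circumference, so the two proposed upgrades multiply.
magnet_gain = 2.0            # magnets roughly doubled in strength
tunnel_gain = 100.0 / 27.0   # circumference grown from 27 km to 100 km
energy_gain = magnet_gain * tunnel_gain  # ~7.4, i.e. "about seven times"
```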
If the leptoquark or Z prime anomalies are confirmed, the Standard Model has to change, Tevong reiterates. “It is very likely that it has to change at energy scales directly accessible to the next generation of colliders, which would guarantee us answers,” he added. While noting that there’s no telling if dark matter has anything to do with the physics behind Zprimes or leptoquarks, the best we can do is seek “as many anomalous measurements as possible, whether at colliders, smaller particle physics experiments, dark matter searches, or cosmological and astrophysical observations,” he said. “Then the dream is that we may be able to form connections between various anomalies that can be linked by a single, elegant theory.”