
07 Jul 2019
CAN YOU MAKE INNOVATION HAPPEN?

Can You Make Innovation Happen?

Companies used to stay competitive by being reliable. They provided the tried and true. Customers valued companies that reduced the risk in their lives. But the technology boom turned that completely around. Now almost every industry must contend with the need to innovate. Customers want products and services that are high quality but also the latest and newest vs. the tried and true.

This demand for innovation has sent companies and their leaders into a tailspin trying to figure out how to make innovation happen. I’m often brought in to help them try to solve this dilemma. But the hardest thing for them to hear is that you can’t make innovation happen. There, I’ve said it.

So now what? Well, actually, there’s a lot that can be done. But the focus isn’t about making innovation happen. It should be on making innovation probable. And there’s quite a bit a company and its leadership can do for that.

Some key opportunities, which range from the simple to the complex, include the following:

Take the risk out of risk taking. One of the biggest challenges companies face is the fear of failure and mistakes. A great way to tackle it is to put it right out in the open. From the CEO to the frontline employee, creating a dialogue that confronts that fear head on helps demystify what it takes to make risk taking work for the company’s culture and goals.

This includes sharing lessons learned, clarifying priorities, encouraging a growth mindset and focusing on the value those lessons create.

Make risk taking more predictable. When leadership discusses how to mitigate risk, it helps set a clearer path through all the gray area of risk taking. This includes sharing a method for how to propose new ideas, build a business case and conduct low-risk trial runs. When the risk is taken out of sharing ideas, people are more likely to focus on genuinely out-of-the-box thinking rather than on avoiding rejection or damage to their reputation.

Get people sharing ideas. Imagination tends to have a fantastic domino effect when shared with others. One out-there idea begets another, until you end up with a genius idea. This is often attempted through group brainstorming. It’s a great concept, in theory. But where it often falls apart is in the execution. Too often the brainstorming sessions become a one- or two-person show. Original ideas can get stamped out by groupthink and approval seeking.

One solution is better facilitated brainstorming sessions. Another option is leveraging collaboration software. Software tools make collaboration independent of time and place, and they also help focus and guide the collaboration to be more productive towards what the company is trying to achieve. Viima Solutions is an example of that kind of software. They focus on providing tools that help facilitate sharing of ideas wherever and whenever.

Measure what’s working and let go of what isn’t. Part of what makes innovation so elusive is people sit around assuming they’ll know what’s innovative or not. But what separates an interesting idea from a truly innovative one is the level of impact it has on the company’s bottom line. If customers don’t care about your idea, then does it matter?

If you know what to measure for, you will be better prepared to gauge whether the issue is the quality of the idea or the readiness of the customer. The latter calls for a phased approach, looking for early adopters and building momentum. The former calls for a post mortem and return to the drawing board. Key things to consider measuring include the effectiveness of collaboration efforts, impact on brand differentiation and consumer behavior.

Have a holistic approach. Though Viima Solutions makes its bread and butter from companies that use its software, the company is the first to admit that the biggest mistake is to think that innovation is easy, or something that can be achieved with a couple of quick, superficial projects like introducing a new software tool or organizing a couple of idea challenges. These kinds of tools and methods are essential for driving sustainable results within the organization but won’t lead to innovation in and of themselves.

What’s ultimately needed is a holistic and determined effort that combines all the key aspects of innovation management: strategy, culture, structures and capabilities. The right tools certainly help across all of these factors, and in putting it all together, but you’re still going to need to put in the work to get all of those different aspects right.

Source: https://www.forbes.com/sites/hvmacarthur/2019/07/03/can-you-make-innovation-happen/#2f017db1b89c

 

27 Apr 2019
10 Major Barriers Hindering The Digitalization Of Chronic Conditions

10 Major Barriers Hindering The Digitalization Of Chronic Conditions



Hundreds of millions of patients are affected by chronic diseases globally, and they often suffer from several of them at once. Despite such a huge opportunity, the coverage of chronic conditions with digital solutions is still below an acceptable level. Each of the more than twenty common chronic conditions has its own specifics and is at its own stage of digital development, yet all of them face the same challenges in the digital health market, according to a recent Research2Guidance report.

1.  Lack of consumer focus

All chronic conditions, apart from diabetes, still lack a consumer focus in their digital offerings. The main goal of the consumer-oriented approach is to provide and promote solutions that empower patients to better manage their disease.

2.  Low added value

The existing chronic disease apps often create little added value for end users. Most of the currently available solutions offer either basic educational advice or simple tracking based on manual or automated data input.

3.  Missing business models for consumers and payers

In the digital health market, payer organizations are now pushing digital services to their member base; however, in the majority of chronic conditions, the reimbursement of digital services is still an issue. Apart from the digital diabetes market, consumer bundle subscription offers or payer models (implemented on the pay-per-member-per-month basis) hardly exist. Digital solution providers should pay more attention to creating reimbursement-oriented packages, which can potentially bundle services, digital content, medical products, and accessories.

4.  Insufficient cost-saving evidence for payers

There are many clinical studies across all major chronic conditions that have generated evidence supporting the efficiency and efficacy of digital solutions in assisting chronic disease patients. What is currently missing is the link to cost savings, which hinders a larger engagement of payer organizations.

5.  Unfavorable demographic and behavioral characteristics of patient populations

Patient population characteristics can have significant implications for digitalizing disease management. Solution providers have to find appropriate strategies to increase patient control despite these demographic and behavioral barriers. The development of caregiver solutions is one of the possible responses to this problem. Currently, nearly all of the available solutions are designed for patients or doctors, whereas the role of caregivers in patient management is still underestimated.

6.  Non-regular use of digital solutions

Chronic conditions are rather different in terms of usage regularity. In conditions with a limited necessity of regular measurements, the creation of passive monitoring solutions, such as wearable-based measurements, could be one of the means to ensure a constant use of digital solutions.

7. Low connectivity level

Although the use of digital apps in managing chronic conditions is increasing, it is not accompanied by significant growth in the use of connected medical devices, such as blood glucose meters, spirometers, blood pressure meters, etc.

8. Missing services for comorbidities

All chronic conditions have numerous comorbidities, which affect patients no less than the primary diagnosed chronic disease. In many cases, patients suffer from several closely related chronic conditions at a time. However, solution providers currently focus on primary chronic diseases, offering no services for conditions such as anxiety, depression, obesity, etc. As a result, many chronic condition offerings remain single-use-case solutions that do not fully address patients’ needs. Weight-loss, hypertension, and diabetes solution providers are already expanding their primary use cases to other conditions, whereas respiratory conditions still lag behind them.

9. Low communication support between medical professionals and patients

Communication between healthcare professionals and patients is seen as the most valuable feature within digital health solutions to drive user engagement and retention. Current digital respiratory solutions offer population management features for HCPs or PDF report generation but do not support direct HCP-patient interaction via chat, email or video call.

10. Dependency on slow regulatory processes

Many digital disease management solutions are treated as medical devices under local regulations if they include regulated components, such as devices and/or medical products. To avoid an excessively long time-to-market, services and/or connected devices can be split into regulated components (such as a bolus calculator in diabetes solutions) and non-regulated ones (such as behavior change features).

Research2Guidance states that overcoming these barriers is a prerequisite for the success of digital health solutions designed for managing chronic conditions. The market has already seen several successful cases in which digital solution providers went beyond obstacles that were perceived as “natural”.

Source:
https://hitconsultant.net/2019/04/15/barriers-digitalization-chronic-conditions/#.XMQB5ujHxPY



21 Mar 2019
WHY EVERY COMPANY NEEDS AN ARTIFICIAL INTELLIGENCE (AI) STRATEGY FOR 2019

Why Every Company Needs An Artificial Intelligence (AI) Strategy For 2019

There’s no doubt that artificial intelligence (AI) is a transformative technology – perhaps even the most transformative technology available today. But if you think the transformative nature of AI is limited to global tech giants and blue-chip companies, think again. AI is ultimately going to transform every business, in every industry. 

Perhaps you read that last sentence and thought to yourself, well, not my business. My retail business [or HR consultancy, B2B service provider, fashion design business, disaster relief charity, football club or whatever] has nothing to do with AI. I repeat, think again. Even if you can’t yet imagine how AI will impact your organisation, trust that, in the not-too-distant future, it most definitely will.

That’s why every company needs an AI strategy.  

Like any business transformation, if you want to get the most out of AI, it all starts with strategy. Your AI strategy will help you to focus on your core business objectives and prioritise ways that AI can help deliver those business goals. 

In general, there are two ways businesses are using AI to drive success: 

  • Creating intelligent products and services 
  • Designing intelligent business processes  

Let’s look at these two uses in a little more detail.  

Intelligent products and services  

AI is, at heart, about making machines smarter, so that they can think and act like humans (or even better). We need only look at the popularity of devices like smart phones, smart fitness trackers and smart thermostats to see how consumers wholeheartedly embrace products and services that can make their life easier, smarter, more streamlined, more connected.  

Source: https://www.forbes.com/sites/bernardmarr/2019/03/21/why-every-company-needs-an-artificial-intelligence-ai-strategy-for-2019/#2a0311c868ea

10 Mar 2019
Business leaders love AI. In theory, that is

Business leaders love AI. In theory, that is

Of the hundreds of AI start-ups examined over the past few years, ‘very few companies are building unambiguously labour-replacing technologies’.

Microsoft has unveiled the results of a survey of business leaders on the topic of artificial intelligence (AI). The findings are surprising: German and Russian entrepreneurs and executives appear to come out ahead of those from the US and other advanced European economies when it comes to adopting the technology.

Mostly, however, this and several other studies confirm a frustrating problem: the AI hype is making it impossible to figure out how much businesses really need it and are using it.

The 800 respondents in the study came from seven countries — the US, Germany, France, the UK, Italy, the Netherlands and Switzerland. It’s not a globe-spanning data-set and it doesn’t include the potential AI leader, China, nor one of the leaders in AI research, Canada. But the study’s scope is respectable. It shows that the US isn’t among the leaders of the AI race, though a 2018 study by Capgemini Consulting, for example, puts it out ahead and Russia far behind.

The problem with this survey — and a similar one by McKinsey — is that when people say they are using AI in their business, they may not all mean the same thing; they may not even be describing uses that fall under the rather broad definition of AI; and they may simply be boasting because the technology is fashionable.

In a new report, “The State of AI: Divergence, 2019,” the UK venture capital fund MMC Ventures claims that “one in seven large companies has adopted AI; in 24 months, two thirds of large companies will have live AI initiatives. In 2019, AI ‘crosses the chasm’ from early adopters to the early majority.”

Read more: https://www.businesslive.co.za/bd/opinion/2019-03-10-business-leaders-love-ai-in-theory-that-is/

24 Jan 2019
A memory pill? Cognitive neuroscience’s contributions to the study of memory

A memory pill? Cognitive neuroscience’s contributions to the study of memory

Hebbian Learning

In 1949, Canadian psychologist Donald Hebb proposed the theory of Hebbian learning to explain how a learning task is transformed into a long-term memory. In this way, healthy habits become automatically retained after their continual repetition.

Learning and memory are a consequence of how our brain cells (neurons) communicate with each other. When we learn, neurons communicate through molecular transmissions that hop across synapses, producing a memory circuit. In a process known as long-term potentiation (LTP), the more often a learning task is repeated, the more often transmission continues and the stronger the memory circuit becomes. It is this unique ability of neurons to create and strengthen synaptic connections through repeated activation that leads to Hebbian learning.
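To make the rule concrete, here is a minimal, purely illustrative Python sketch (not from the article): one postsynaptic neuron with a simple linear response, and the classic Hebbian update in which a synapse strengthens in proportion to the co-activity of its presynaptic and postsynaptic neurons. Repeating the same input strengthens only the synapses that are active together, the essence of “fire together, wire together.”

```python
import numpy as np

# Purely illustrative sketch of Hebb's rule: a synapse strengthens whenever
# its presynaptic and postsynaptic neurons are active at the same time.
# Real long-term potentiation involves far richer biophysics.

def hebbian_update(weights, pre, post, learning_rate=0.1):
    # Each weight grows in proportion to pre * post co-activity.
    return weights + learning_rate * post * pre

pre = np.array([1.0, 0.0, 1.0])   # inputs 0 and 2 fire; input 1 stays silent
weights = np.full(3, 0.05)        # weak initial synapses

for repetition in range(10):      # repeating the "learning task"
    post = float(weights @ pre)   # simple linear postsynaptic response
    weights = hebbian_update(weights, pre, post)

print(weights)  # synapses from the co-active inputs have strengthened;
                # the silent input's synapse is unchanged
```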

Memory and the hippocampus

Understanding the brain requires investigation through different approaches and from a variety of specialities. The field of cognitive neuroscience initially developed through a small number of pioneers. Their experimental designs and observations led to the foundation for how we understand learning and memory today.

Donald Hebb’s contributions at McGill University remain the driving force to explain memory. Under his supervision, neuropsychologist Brenda Milner studied a patient with impaired memory following a lobectomy. Further studies with neurosurgeon Wilder Penfield enabled Milner to expand her study of memory and learning in patients following brain surgery.

Read more: https://theconversation.com/a-memory-pill-cognitive-neurosciences-contributions-to-the-study-of-memory-109707

25 Jan 2018
Manahel Thabet

We May Have Just Figured Out How the Brain Processes Good and Bad Experiences

IN BRIEF

We associate positive feelings with foods and experiences we enjoy, and negative feelings with the opposite. One new study dives deeper than ever before into why.

HOW TO FEEL

A new study has mapped out in unprecedented detail the “neighborhoods” of the brain that assign good and bad feelings to objects and experiences. Led by MIT neuroscientist Kay Tye, the research is illuminating brain processes that neuroscientists still don’t understand, and could have implications for treating mental health disorders.

In 2016, Tye’s research team found that within the amygdala — the center for emotions in the brain — there are neurons that assign good or bad feelings known as “valence.” These responses are integral to human survival; it is vitally important that we remember what foods or other experiences are good, and what are bad and could sicken or kill us. The new study, published in the journal Cell Reports, more deeply explores the inner workings of valence by focusing on a particular section of the amygdala, the basolateral amygdala.

The team, led by lead author Anna Beyeler, trained mice to associate “good”-tasting sucrose drops with a certain audible tone, and bitter quinine drops with a different tone. They later recorded the neural responses of the mice when the different tones were played, to see which valence the mice had been conditioned to express. They then took the neurons identified as playing a key role in valence and engineered them to respond to light pulses. This allowed them to record the electrical activity of those neurons and their neighbors, revealing what influenced local circuits and how.

By looking at these interactions and system structures close up, the team found that within the basolateral amygdala region, there are distinct and diverse “neighborhoods,” in which valence is determined through connections to other regions in the brain and interactions with the basolateral amygdala itself.

UNDERSTANDING VALENCE

At the end of the experiment, the team had mapped over 1,600 neurons. Within these, they highlighted three different types of neurons that project to different parts of the brain and are associated with different types of valence. The team also found that different types of neurons tend to group together in “hotspots.” However, despite these tight groupings, they also noted that different types of neurons often mixed together.

The future applications of this research are as yet undefined. Yet there are hopes that by understanding how the brain processes good and bad experiences, scientists could better understand certain mental health issues and addiction. Additionally, the researchers found that, depending on the type of neuron, they have different abilities to influence one another. Tye stated in a press release that this could be due to the mixing they observed: “Perhaps the intermingling that there is might facilitate the ability of these neurons to influence each other.”

“Perturbations of emotional valence processing is at the core of many mental health disorders,” said Tye in the press release. “Anxiety and addiction, for example, may be an imbalance or a misassignment of positive or negative valence with different stimuli.”

Even beyond this, there is theoretical potential in manipulating feelings or desires through control of these neurons and networks. Researchers have not suggested any plans to use the research this way, but it is certainly not out of the question.

Source: Futurism

23 Jan 2018
Manahel Thabet

Antibiotic Algorithm Will Fast-Track the Search for New Medicine

IN BRIEF

Antibiotic resistance is an issue that affects countless people all around the world, and it’s only getting worse. A newly developed algorithm helps scientists find variants of known antibiotics to support the fight against resistance.

HUNTING FOR ANTIBIOTICS

Antibiotic resistance is a growing issue in which harmful bacteria in the body are no longer receptive to the effects of antibiotics. Because of this issue, more and more patients struggle with everything from common illnesses to much more severe bacterial infections that could cause life-threatening harm.

One technique that could combat antibiotic resistance is finding variants of known antibiotics, or peptidic natural products (PNPs). Unfortunately, finding these variants has been an arduous and time-consuming process. Until now: a group of American and Russian computer scientists has created an antibiotic algorithm that, by rapidly sorting through databases, can discover 10 times more new variants of PNPs than all previous efforts combined.

The algorithm, known as VarQuest, is described in the latest issue of the journal Nature Microbiology. Hosein Mohimani, an assistant professor in Carnegie Mellon University’s Computational Biology Department, said in a press release that VarQuest completed a search that would traditionally have taken hundreds of years of computation.


Mohimani also said that the study expanded their understanding of the microbial world. Not only does finding more variants quickly increase researchers’ ability to formulate alternative antibiotics; it can also provide vital information to microbiologists.

“Our results show that the antibiotics produced by microbes are much more diverse than had been assumed,” said Mohimani in a press release.

Mohimani noted that, because VarQuest was able to find over one thousand variants and in such a short amount of time, it could give microbiologists a larger perspective, perhaps alerting them to trends or patterns that wouldn’t otherwise be noticeable.

FIGHTING RESISTANCE

VarQuest’s success stands on the shoulders of computing progress made over the past few years. High-throughput methods have advanced, allowing samples to be processed in batches instead of one at a time, making the process much faster. Additionally, the effort has been supported by the Global Natural Products Social (GNPS) molecular network, launched in 2016. This is a database in which researchers from around the world collect data on natural products based on their mass spectra, the chemical fingerprints produced when a sample’s ionized fragments are sorted by their mass-to-charge ratio. Using this database alongside VarQuest could drastically enhance drug discovery abilities.
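As a rough illustration of the general idea of variant search (not the actual VarQuest algorithm, which works on full mass spectra and is far more sophisticated), a measured precursor mass can be compared against a reference list of known PNPs, and measurements that differ by a plausible modification mass get flagged as candidate variants. All names, masses and tolerances below are invented for the example.

```python
# Toy variant search: flag measurements whose mass differs from a known
# peptidic natural product (PNP) by a plausible modification mass.
# Reference masses, tolerance and cutoff are made-up illustrative values.

KNOWN_PNPS = {              # hypothetical reference masses (Daltons)
    "surfactin":  1036.69,
    "daptomycin": 1620.67,
}

MAX_MODIFICATION = 150.0    # largest mass shift treated as a plausible variant
TOLERANCE = 0.02            # assumed instrument mass accuracy (Da)

def candidate_variants(measured_mass):
    """Return (compound, mass_shift) pairs the measurement could correspond to."""
    hits = []
    for name, reference_mass in KNOWN_PNPS.items():
        shift = measured_mass - reference_mass
        if abs(shift) <= TOLERANCE:
            hits.append((name, 0.0))              # matches the known compound
        elif abs(shift) <= MAX_MODIFICATION:
            hits.append((name, round(shift, 2)))  # candidate modified variant
    return hits

# Example: a precursor mass about 14 Da above surfactin would be reported
# as a candidate variant of it.
print(candidate_variants(1050.71))   # [('surfactin', 14.02)]
```

A real search must also score how well the observed fragment spectrum matches each candidate, which is where the heavy computation that VarQuest accelerates comes in.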

“Natural product discovery is turning into a Big Data territory, and the field has to prepare for this transformation in terms of collecting, storing and making sense of Big Data,” Mohimani said of this growing data and scientists’ ability to access it. “VarQuest is the first step toward digesting the Big Data already collected by the community.”

This issue is only going to get worse if steps are not taken to prevent resistance in the first place. However, as solutions are crafted, working antibiotics are still needed for those facing both resistance and infection. This antibiotic algorithm will be a critical tool in mitigating the effects of resistance while also giving microbiologists a big-picture view, hopefully propelling research forward.

Every time a person consumes an antibiotic, evolution pushes bacterial species to develop resistance and multiply. Children and elderly adults have the highest rates of antibiotic use, but are also higher-risk groups to begin with, making this a particularly concerning issue for them. Many partially attribute the drastic growth of the resistance problem to the needless prescription of antibiotics to patients with viral illnesses, for whom antibiotics have no effect other than to create resistance.

Source: Futurism

26 Nov 2017

A Cheap, Portable Skin Cancer Detector Has Won the Dyson Award

IN BRIEF

A team of engineering graduates has won the prestigious James Dyson Award for their cheap and portable device that can detect melanoma, a form of skin cancer. The device could potentially save thousands of lives, as skin cancer is the most common type worldwide.

MEET SKAN

Detecting skin cancer early isn’t easy. Currently, it’s done through visual inspections or biopsies, but some doctors may not pick up on the disease using the former, while some patients may not be able to afford the latter. As such, a team of graduates from McMaster University in Canada set out to develop an inexpensive skin cancer detector, and their innovative work has earned them the prestigious international James Dyson Award.

Cancer affects the metabolic rate of skin cells, with cancerous cells heating up faster than their healthy counterparts following a shock of cold temperature.

To make identifying these cells easier, the McMaster University team — Michael Takla, Rotimi Fadiya, Prateek Mathur, and Shivad Bhavsar — built a skin cancer detector with 16 thermistors that can track the rate of temperature increase following a cold shock from an ice pack.

The thermistors are simply placed on the potentially cancerous area of skin, and the device produces a heat map that can be used to determine the presence of melanoma.
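The principle lends itself to a simple sketch. The toy code below is not the sKan team’s actual method; the grid values and threshold are invented. It takes hypothetical rewarming rates from a 4x4 grid of thermistors after a cold shock and flags the patches that rewarm markedly faster than the rest.

```python
import numpy as np

# Toy illustration only: flag thermistors whose skin patch rewarms
# markedly faster than the others after a cold shock. All numbers
# are hypothetical; this is not the sKan team's algorithm.

# Hypothetical rewarming rates (degrees C per second) on a 4x4 grid.
rates = np.array([
    [0.020, 0.021, 0.019, 0.020],
    [0.022, 0.035, 0.033, 0.021],   # a faster-warming patch
    [0.020, 0.034, 0.032, 0.019],
    [0.021, 0.020, 0.021, 0.020],
])

median = np.median(rates)
spread = np.median(np.abs(rates - median))   # robust spread (MAD)
suspicious = rates > median + 3 * spread     # markedly faster rewarming

print(suspicious.astype(int))   # 0/1 "heat map" of flagged thermistor positions
```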

“By using widely available and inexpensive components, the sKan allows for melanoma skin cancer detection to be readily accessible to the many,” award founder James Dyson said in a statement announcing the win. “It’s a very clever device with the potential to save lives around the world.”

In addition to winning the Dyson Award for their skin cancer detector, the team also received a cash prize of approximately $40,000 to advance their research. They received $10,000 at the Forge’s Student Start-up Pitch competition in March.

DIAGNOSING SKIN CANCER

According to Mathur, the team was inspired to create sKan after realizing technology hadn’t had the same impact on skin cancer diagnosis as it had on other medical fields.

“We found research that used the thermal properties of cancerous skin tissue as a means of detecting melanoma. However, this was done using expensive lab equipment,” he said in a McMaster University news release. “We set out to apply the research and invent a way of performing the same assessment using a more cost-effective solution.”

Going forward, the sKan team hopes to create a more advanced prototype that will allow them to begin pre-clinical testing.


As reported by The Guardian, nearly 39 people are diagnosed with skin cancer every day in the U.K., and the American Cancer Society (ACS) estimates 87,110 new cases of melanoma will be diagnosed in the U.S. in 2017, with 9,730 people dying from the condition. Early detection is key to cancer survival, so if sKan succeeds, it could significantly reduce that number.

“Our aspirations have become a reality,” said Mathur. “Skin cancers are the most common form of cancer worldwide, and the potential to positively impact the lives of those affected is both humbling and motivating.”

Source: Futurism

25 Nov 2017

A New Debit Card Is Poised to Make Spending Crypto a Breeze

IN BRIEF

To make it easier for people in the United Kingdom to spend their various cryptocurrencies, startup London Block Exchange is launching a new Visa debit card called the Dragoncard. It pays the retailer in pounds, then takes money from the consumer’s crypto wallet.

SPENDABLE CRYPTO

Cryptocurrencies such as ether and bitcoin are surging in popularity thanks to their many benefits over traditional currencies, but they still lag behind those currencies in one key way: they are not easy to spend in physical stores. People can spend USD and euros using a plethora of debit, credit, and gift cards, but their options are severely limited when it comes to spending bitcoin or ether using a cryptocurrency debit card.

That’s starting to change, though. The Centra Card can be used just like a debit card to spend bitcoin, ether, dash, and several other popular cryptocurrencies. Token Card is another cryptocurrency debit card, and soon, London startup London Block Exchange (LBX) will launch a prepaid Visa debit card that will act in the same fashion.


The Dragoncard will allow people to convert their bitcoin, ether, ripple, litecoin, and monero to sterling (aka the British pound) at the time of purchase, thereby making it significantly easier for those currencies to be spent in stores throughout the United Kingdom, including ones that have yet to accept alternative forms of payment.

Business Insider reports the cryptocurrency debit card will be issued by pre-paid card provider Wavecrest, and it comes alongside an app people can use to buy and manage cryptocurrencies on LBX’s own exchange. When someone uses the Dragoncard, LBX will pay the retailer in pounds first, then take the equivalent amount from the shopper’s cryptocurrency wallet.

Before rushing off to get a Dragoncard when it debuts in December, though, interested crypto owners should know a few things. First, the card itself is £20 ($26.33). Second, they will be charged a 0.5 percent fee whenever they buy or sell cryptocurrencies on LBX’s platform. Lastly, provider Wavecrest charges a small fee for ATM withdrawals — it is a debit card, after all.
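As a back-of-the-envelope illustration of the flow described above, the sketch below converts a purchase priced in pounds into the amount of cryptocurrency deducted from the shopper’s wallet. The exchange rate is made up, and applying the 0.5 percent fee on top of the purchase price is an assumption for illustration; the card’s actual fee mechanics may differ.

```python
# Illustrative only: LBX pays the retailer in pounds, then deducts the
# equivalent amount of cryptocurrency from the shopper's wallet. The
# exchange rate is invented, and treating the 0.5% fee as a surcharge on
# the purchase is an assumption made for this sketch.

def crypto_debit(purchase_gbp, gbp_per_coin, exchange_fee=0.005):
    """Coins deducted to cover a purchase plus the exchange fee."""
    gross_gbp = purchase_gbp * (1 + exchange_fee)
    return gross_gbp / gbp_per_coin

# Example: a 50 pound purchase with bitcoin assumed at 6,000 pounds per BTC.
coins = crypto_debit(50.0, 6_000.0)
print(f"{coins:.6f} BTC deducted")   # 0.008375 BTC
```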

THE PATH TO ACCEPTANCE

Despite the fees, the Dragoncard and other cryptocurrency debit cards have the potential to help crypto become widely accepted and, more importantly, understood.

The Dragoncard also arrives at a time when bitcoin is experiencing quite a growth spurt. With schools, companies, and even nations starting to embrace bitcoin, the currency is poised to continue increasing in value and popularity, and with the Dragoncard, LBX is hoping to help Londoners join that ever-growing segment of crypto supporters.

“Despite being the financial capital of the world, London is a difficult place for investors to enter and trade in the cryptocurrency market,” LBX founder and CEO Benjamin Dives reportedly said in a statement. “We’ll bring it into the mainstream by removing the barriers to access, and by helping people understand and have confidence in what we believe is the future of money.”

“We’re offering a grown up and robust experience for those who wish to safely and easily understand and invest in digital currencies,” said LBX’s executive chairman Adam Bryant. “We’re confident we’ll transform this market in the U.K. and will become the leading cryptocurrency and blockchain consultancy for institutional investors and consumers alike.”

Disclosure: Several members of the Futurism team, including the editors of this piece, are personal investors in a number of cryptocurrency markets. Their personal investment perspectives have no impact on editorial content.

Source: Futurism

22 Nov 2017

Measurements From CERN Suggest the Possibility of a New Physics

A New Quantum Physics?

During the mid- to late-twentieth century, quantum physicists picked apart the unified theory of physics that Einstein’s theory of relativity offered. The physics of the large was governed by gravity, but only quantum physics could describe observations of the small. Since then, a theoretical tug-o-war between gravity and the other three fundamental forces has continued as physicists try to extend gravity or quantum physics to subsume the other as more fundamental.

Recent measurements from the Large Hadron Collider show a discrepancy with Standard Model predictions that may hint at entirely new realms of the universe underlying what’s described by quantum physics. Although repeated tests are required to confirm these anomalies, a confirmation would signify a turning point in our most fundamental description of particle physics to date.

Quantum physicists found in a recent study that mesons don’t decay into kaon and muon particles often enough, according to the Standard Model predictions of frequency. The authors agree that enhancing the power of the Large Hadron Collider (LHC) will reveal a new kind of particle responsible for this discrepancy. Although errors in data or theory may have caused the discrepancy, instead of a new particle, an improved LHC would prove a boon for several projects on the cutting edge of physics.

The Standard Model

The Standard Model is a well-established fundamental theory of quantum physics that describes three of the four fundamental forces believed to govern our physical reality. Quantum particles occur in two basic types, quarks and leptons. Quarks bind together in different combinations to build particles like protons and neutrons. We’re familiar with protons, neutrons, and electrons because they’re the building blocks of atoms.

The “lepton family” features heavier versions of the electron — like the muon — and the quarks can coalesce into hundreds of other composite particles. Two of these, the Bottom and Kaon mesons, were culprits in this quantum mystery. The Bottom meson (B) decays to a Kaon meson (K) accompanied by a muon (mu-) and an anti-muon (mu+) particle.

The Anomaly

The researchers found a 2.5 sigma variance, or a 1 in 80 probability, “which means that, in the absence of unexpected effects, i.e. new physics, a distribution more deviant than observed would be produced about 1.25 percent of the time,” Professor Spencer Klein, senior scientist at Lawrence Berkeley National Laboratory, told Futurism. Klein was not involved in the study.

This means the frequency of mesons decaying into strange quarks during the LHC proton-collision tests fell a little below the expected frequency. “The tension here is that, with a 2.5 sigma [or standard deviation from the normal decay rate], either the data is off by a little bit, the theory is off by a little bit, or it’s a hint of something beyond the standard model,” Klein said. “I would say, naïvely, one of the first two is correct.”

To Klein, this variance is inevitable considering the high volume of data run by computers for LHC operations. “With petabyte-sized (10¹⁵ bytes) datasets from the LHC, and with modern computers, we can make a very large number of measurements of different quantities,” Klein said. “The LHC has produced many hundreds of results. Statistically, some of them are expected to show 2.5 sigma fluctuations.” Klein noted that particle physicists usually wait for a 5-sigma fluctuation before crying wolf — corresponding to roughly a 1-in-3.5-million fluctuation in data.
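Those sigma-to-probability conversions follow directly from the normal distribution. The short sketch below (using SciPy; it is not part of the study) reproduces the quoted figures: the 2.5 sigma and 4 sigma numbers correspond to the two-sided tail, while the 5 sigma discovery threshold is conventionally quoted as a one-sided tail of roughly 1 in 3.5 million.

```python
from scipy.stats import norm

# Map a sigma level to a tail probability under a normal distribution.
# Not part of the study; just reproduces the figures quoted above.

def two_sided(sigma):
    # Chance of a deviation at least this large in either direction.
    return 2 * norm.sf(sigma)

def one_sided(sigma):
    # Chance of an excess at least this large in one direction.
    return norm.sf(sigma)

for s in (2.5, 4.0):
    p = two_sided(s)
    print(f"{s} sigma (two-sided): p = {p:.4%}, about 1 in {1 / p:,.0f}")

print(f"5.0 sigma (one-sided): about 1 in {1 / one_sided(5.0):,.0f}")

# Output:
# 2.5 sigma (two-sided): p = 1.2419%, about 1 in 81
# 4.0 sigma (two-sided): p = 0.0063%, about 1 in 15,787
# 5.0 sigma (one-sided): about 1 in 3,488,556
```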

 


These latest anomalous observations do not exist in a vacuum. “The interesting aspect of the two taken in combination is how aligned they are with other anomalous measurements of processes involving B mesons that had been made in previous years,” Dr. Tevong You, co-author of the study and junior research fellow in theoretical physics at Gonville and Caius College, University of Cambridge, told Futurism. “These independent measurements were less clean but more significant. Altogether, the chance of measuring these different things and having them all deviate from the Standard Model in a consistent way is closer to 1 in 16000 probability, or 4 sigma,” Tevong said.

Extending the Standard Model

Barring statistical or theoretical errors, Tevong suspects that the anomalies mask the presence of entirely new particles, called leptoquarks or Z prime particles. Inside bottom mesons, quantum excitations of new particles could be interfering with normal decay frequency. In the study, researchers conclude that an upgraded LHC could confirm the existence of new particles, making a major update to the Standard Model in the process.

“It would be revolutionary for our fundamental understanding of the universe,” said Tevong. “For particle physics […] it would mean that we are peeling back another layer of Nature and continuing on a journey of discovering the most elementary building blocks. This would have implications for cosmology, since it relies on our fundamental theories for understanding the early universe,” he added. “The interplay between cosmology and particle physics has been very fruitful in the past. As for dark matter, if it emerges from the same new physics sector in which the Zprime or leptoquark is embedded, then we may also find signs of it when we explore this new sector.”

The Power to Know

So far, scientists at the LHC have only observed ghosts and anomalies hinting at particles that exist at higher energy levels. To prove their existence, physicists “need to confirm the indirect signs […], and that means being patient while the LHCb experiment gathers more data on B decays to make a more precise measurement,” Tevong said. “We will also get an independent confirmation by another experiment, Belle II, that should be coming online in the next few years. After all that, if the measurement of B decays still disagrees with the predictions of the Standard Model, then we can be confident that something beyond the Standard Model must be responsible, and that would point towards leptoquarks or Zprime particles as the explanation,” he added.

To establish their existence, physicists would then aim to produce the particles in colliders the same way Bottom mesons or Higgs bosons are produced, and watch them decay. “We need to be able to see a leptoquark or Zprime pop out of LHC collisions,” Tevong said. “The fact that we haven’t seen any such exotic particles at the LHC (so far) means that they may be too heavy, and more energy will be required to produce them. That is what we estimated in our paper: the feasibility of directly discovering leptoquarks or Zprime particles at future colliders with higher energy.”

Quantum Leap for the LHC

Seeking out new particles in the LHC isn’t a waiting game. The likelihood of observing new phenomena is directly proportional to how many new particles pop up in collisions. “The more the particle appears the higher the chances of spotting it amongst many other background events taking place during those collisions,” Tevong explained. For the purposes of finding new particles, he likens it to searching for a needle in a haystack; it’s easier to find a needle if the haystack is filled with them, as opposed to one. “The rate of production depends on the particle’s mass and couplings: heavier particles require more energy to produce,” he said.


This is why Tevong and co-authors B.C. Allanach and Ben Gripaios recommend either extending the LHC loop’s length, thus reducing the amount of magnetic power needed to accelerate particles, or replacing the current magnets with stronger ones.

According to Tevong, the CERN laboratory is slated to keep running the LHC in its present configuration until the mid-2030s. Afterwards, they might upgrade the LHC’s magnets, roughly doubling their strength. In addition to souped-up magnets, the tunnel could be enlarged from the present 27 km to 100 km (17 to 62 miles). “The combined effect […] would give about seven times more energy than the LHC,” Tevong said. “The timescale for completion would be at least in the 2040s, though it is still too early to make any meaningful projections.”
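That “about seven times” figure can be sanity-checked with a line of arithmetic: for a circular proton collider, the achievable beam energy scales roughly with the bending radius times the magnetic field strength, so a tunnel 100/27 times longer with magnets about twice as strong gives roughly a sevenfold gain.

```python
# Rough sanity check of the "about seven times more energy" figure.
# For a circular proton collider, beam energy scales roughly with
# bending radius times magnetic field (E[GeV] ~ 0.3 * B[T] * R[m]),
# so the gain is (new circumference / old) * (new field / old).
lhc_circumference_km = 27
future_circumference_km = 100
field_gain = 2  # magnets "roughly doubling" in strength

energy_gain = (future_circumference_km / lhc_circumference_km) * field_gain
print(f"~{energy_gain:.1f}x the LHC's energy")  # ~7.4x, i.e. "about seven times"
```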

If the leptoquark or Z prime anomalies are confirmed, the Standard Model has to change, Tevong reiterates. “It is very likely that it has to change at energy scales directly accessible to the next generation of colliders, which would guarantee us answers,” he added. While noting that there’s no telling if dark matter has anything to do with the physics behind Zprimes or leptoquarks, the best we can do is seek “as many anomalous measurements as possible, whether at colliders, smaller particle physics experiments, dark matter searches, or cosmological and astrophysical observations,” he said. “Then the dream is that we may be able to form connections between various anomalies that can be linked by a single, elegant theory.”

Source: Futurism