Month: May 2019

30 May 2019
AI Could Be Better For The Workplace Than We Think, But We Still Need To Be Careful

It is amazing what difference a few months can make. Not so long ago, there was a widely held view that the workplace as we know it was about to face fundamental change, with a whole range of jobs currently filled by people with varying levels of qualifications being replaced by machines. Some experts suggested that as many as half of jobs could be in the firing line. Not for nothing was there a surge of interest in the idea of a Universal Basic Income, or something similar, as a response to the prospect that great swathes of the population would have nothing to do and, more importantly, nothing to live on. In recent months, however, Artificial Intelligence and machine learning, the technologies that enable this latest form of automation, have attracted rather more positive opinions. Even the authors of one of the studies that — they say unintentionally — helped kick-start the panic over robots taking over jobs have adopted a more nuanced approach.

A generally upbeat message is offered by a study carried out by the CIPD, the U.K.’s professional body for HR and people development, and PA Consulting, the global transformation and innovation consultancy. The report, People and Machines: From Hype to Reality, suggests that, rather than “an inevitable march of the robots” leaving poor quality jobs and mass unemployment, the new technologies could create “a positive human future and social gains through higher skilled jobs, flexible ways of working and creating environments where people’s ingenuity can flourish.”

However, as Katharine Henley, workforce transformation expert at PA, makes clear, this happy outcome is not inevitable. It requires employers and business leaders to consider the people perspective and integrate technology plans into a well-developed people strategy. In particular, HR departments need to take a more central role than is largely the case at present. “They need to constantly show their value,” she says, adding that — by encouraging management teams to think more strategically about the issue — they can look at the “human element” and improve the employee experience. Indeed, one of the more surprising findings in the survey was that employees felt the introduction of AI might improve their wellbeing through providing opportunities to learn more things, requiring them to do fewer monotonous tasks and giving them more control over their work.

The report stresses that some types of jobs are obviously more likely to be affected by AI and automation than others. Among those deemed to be less likely to suffer an impact are sales and service workers — “perhaps because they are strongly customer-facing”. However, Sanjeev Sularia, chief executive and co-founder of the retail intelligence business Intelligence Node, argues that retail is being transformed by the new technology. Sularia, a member of the Forbes Technology Council, says that, thanks to AI, his company is able to help retailers adjust prices and inventory in real time and so make them more competitive and responsive to customers. He still sees a role for the salesperson, but insists that they will have to be much more proactive than is generally the case at present. In addition to being much more attentive and able to anticipate what customers want, they will need to be prepared to use the technology in order to, for instance, sell related goods or services at the till.

Source:
https://www.forbes.com/sites/rogertrapp/2019/05/30/ai-could-be-better-for-the-workplace-than-we-think-but-we-still-need-to-be-careful/#6d509a01622f

29 May 2019
Artificial Intelligence In The Workplace: How AI Is Transforming Your Employee Experience

Artificial intelligence (AI) is quickly changing just about every aspect of how we live our lives, and our working lives certainly aren’t exempt from this.

Soon, even those of us who don’t happen to work for technology companies (although as every company moves towards becoming a tech company, that will be increasingly few of us) will find AI-enabled machines increasingly present as we go about our day-to-day activities.

From how we are recruited and on-boarded to how we go about on-the-job training, personal development and eventually passing on our skills and experience to those who follow in our footsteps, AI technology will play an increasingly prominent role.

Here’s an overview of some of the recent advances made in businesses that are currently on the cutting-edge of the AI revolution, and are likely to be increasingly adopted by others seeking to capitalize on the arrival of smart machines.

Recruitment and onboarding

Before we even set foot in a new workplace, AI-enabled machines may soon have played their part in ensuring we’re the right person for the job.

AI pre-screening of candidates before inviting the most suitable in for interviews is an increasingly common practice at large companies that make thousands of hires each year, and sometimes attract millions of applicants.

Pymetrics is one provider of tools for this purpose. It enables candidates to sit a video interview in their own home, sometimes on a different continent to the organization they are applying to work for. After answering a series of standard questions, their answers – as well as their facial expressions and body language – are analyzed to determine whether or not they would be a good fit for the role.

Another company providing these services is Montage, which claims that 100 of the Fortune 500 companies have used its AI-driven interviewing tool to identify talent before issuing invitations for interviews.

When it comes to onboarding, AI-enabled chatbots are the current tool of choice for helping new hires settle into their roles and get to grips with the various facets of the organizations they’ve joined.

Multinational consumer goods manufacturer Unilever uses a chatbot called Unabot, which employs natural language processing (NLP) to answer employees’ questions in plain, human language. Advice is available on everything from where they can catch a shuttle bus to the office in the morning, to how to deal with HR and payroll issues.

Read more:
https://www.forbes.com/sites/bernardmarr/2019/05/29/artificial-intelligence-in-the-workplace-how-ai-is-transforming-your-employee-experience/#ba660f553cec

28 May 2019
Keep Your A.I. Buzzwords Straight

Artificial intelligence is having its moment. Business leaders can’t stop talking about it. New tech products invariably include it. And news headlines incessantly chronicle the buzz around it. But for many people, artificial intelligence remains a mystery.

To help, we’ve created a guide that explains some of the key terms associated with the technology, an increasingly useful tool for businesses that improves as it crunches more data.

Reinforcement Learning

This A.I. technique is like training a dog with treats. The software learns from successfully executing a task and, on the flip side, from failure. Fusing reinforcement learning with deep learning has led to tremendous breakthroughs, like computers beating humans at complicated video and board games. Example: Facebook’s targeted notifications.
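
To make the “treats” analogy concrete, here is a minimal, purely illustrative Python sketch of the reward-driven loop at the heart of reinforcement learning; the toy actions, reward probabilities and learning rate are invented for this example rather than drawn from any real system.

```python
import random

# Toy reinforcement-learning sketch: an agent learns, by trial and reward,
# which of two actions pays off in a one-state "environment".
# All names and numbers here are invented for illustration.

q_values = {"good_action": 0.0, "bad_action": 0.0}  # the agent's learned estimates
learning_rate = 0.1
epsilon = 0.2  # how often the agent explores instead of exploiting

def reward(action):
    # The "treat": good_action usually pays off, bad_action does not.
    return 1.0 if (action == "good_action" and random.random() < 0.9) else 0.0

for step in range(1000):
    if random.random() < epsilon:
        action = random.choice(list(q_values))       # explore
    else:
        action = max(q_values, key=q_values.get)     # exploit what has been learned
    # Nudge the estimate toward the observed reward: success reinforces, failure weakens.
    q_values[action] += learning_rate * (reward(action) - q_values[action])

print(q_values)  # good_action ends near 0.9, bad_action stays near 0.0
```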

Neural Networks

A.I.’s rise can be traced to software developed decades ago that was intended to approximate how the human brain learns. Inside a neural network are layers of interconnected nodes where calculations take place that help computers sift through data in minute detail. By doing so, the software can learn to recognize patterns that even the most intelligent humans may overlook. Example: Baidu search.
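
As a loose illustration of “layers of interconnected nodes where calculations take place,” the Python sketch below pushes an input through two randomly weighted layers; the layer sizes and weights are arbitrary stand-ins, since a real network learns its weights from data.

```python
import numpy as np

# Minimal sketch of a neural network's layered structure: each layer is a set of
# nodes whose outputs are weighted sums of the previous layer's outputs, passed
# through a nonlinearity. Weights here are random; a real network learns them.

rng = np.random.default_rng(0)
layer_sizes = [4, 8, 3]  # input features -> hidden nodes -> output nodes
weights = [rng.normal(size=(m, n)) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    for w in weights:
        x = np.maximum(0.0, x @ w)  # each node: weighted sum, then ReLU activation
    return x

print(forward(rng.normal(size=4)))  # activations of the 3 output nodes
```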

Deep Learning

Mixing neural networks with machine learning makes for deep learning, a powerful technology that can crunch enormous amounts of data, like vast archives of audio clips. A.I.’s biggest breakthroughs—such as recognizing snow leopards in photos—can be traced to the technology. Example: Nvidia’s 3D A.I.-generated faces.

Machine Learning

You can thank machine learning for recommending how to respond to your boss when she emails asking whether an important document is in order (“Looks good to me”) or whether you can meet at noon (“Let’s do it!”). This is just a taste of how algorithms help computers “learn.” The chief attraction: Companies don’t need humans to program the technology for each specific task it handles. Example: Google Gmail.
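
A toy sketch of the suggested-reply idea: rather than hand-coding a rule for every phrasing, a model is shown a few labelled example emails and learns which canned response fits. The example emails, labels and the choice of scikit-learn are illustrative assumptions here, not a description of Google’s actual system.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy "suggested reply" model: learn from labelled example emails which canned
# response fits, instead of writing a rule for every possible phrasing.
# Training emails and reply labels below are invented for illustration.

emails = [
    "is the report in order", "does the document look correct",
    "can we meet at noon", "are you free for lunch at 12",
]
replies = ["Looks good to me", "Looks good to me", "Let's do it!", "Let's do it!"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, replies)

print(model.predict(["is the contract in order"]))  # expected: ["Looks good to me"]
print(model.predict(["can we meet at 1pm"]))        # expected: ["Let's do it!"]
```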

Computer Vision

Devices using computer vision are able to see and understand their surroundings almost like a human. Think of facial-recognition technology that can automatically unlock your iPhone or the systems that help navigate self-driving cars without crashing them into trees. The problem seems easy to solve. But in reality, it’s very difficult. Example: Waymo’s autonomous vehicles.

Natural Language Processing

This technology makes it possible for computers to understand and react to human speech and language. Voice-controlled digital assistants, which take dictation or power Internet-connected home speakers, would be impossible without it. The technology is still imperfect, but it’s improving quickly. Example: Amazon Alexa digital assistant.

Source:
http://fortune.com/2019/05/28/ai-buzzwords/

27 May 2019

London leads Siemens Atlas of Digitalization as most digitally ready global city

Siemens has launched a new web-based application which reveals the readiness and potential of six major cities to embrace digitalization and develop new ways of living, working and interacting. The Atlas of Digitalization is based around the interconnected themes of Expo 2020 Dubai – Mobility, Sustainability and Opportunity – and assesses how the fourth industrial revolution has already impacted urban life around the world, and the potential it could have in the future.

Data from 21 indicators has been analyzed from Buenos Aires, Dubai, Johannesburg, London, Los Angeles, and Taipei. From this analysis a Digital Readiness Score has been defined, considering areas such as smart electricity and transport systems, internet connections and digital governance services. The score reveals the current level of maturity of each city’s digital infrastructure, and its preparedness for a connected future.

Juergen Maier, CEO of Siemens UK, said: “It is tremendous that London is leading the charge in digitalization among these global cities. In spite of all the economic uncertainty we have been facing in the UK over the last two years, this study shows we are still well placed to achieve leadership globally in the fourth industrial revolution if we continue to invest, innovate and grow responsibly and sustainably.”

“However, there is more to the UK economy than London, and our Northern cities, particularly in the Northern Powerhouse, must also benefit from innovation and investment. Each city here in the UK and globally must address its own unique mix of challenges and opportunities by embracing digitalization – the key to sustainable, economically vibrant future cities.”

“The Atlas of Digitalization gives us an insight to the current status of digitalization in global cities, and the data tells us London has already made excellent progress. We hope the Atlas will inspire new ways of thinking to shape all the smart cities of tomorrow and realize the global potential of City 4.0,” added Maier.

Innovation and the environment

The analysis takes into account areas such as innovation, greenhouse gas emissions and time spent in traffic to give the cities a Digital Potential Score, indicating where there is opportunity to grow digital capabilities to transform society and economy. Together, the Readiness and Potential scores illustrate the different capacities each city already has, and where they can develop to effect change and growth.

The Atlas recognizes London’s advanced implementation of digital technologies in areas such as the introduction of the congestion charge and the Ultra Low Emission Zone which will dramatically cut nitrogen oxide (NOx) emissions. Currently road traffic in London is responsible for more than half of the city’s NOx emissions.

The Atlas also identifies potential for digitalization to positively impact areas in London such as improving mobile internet speeds and opening up new opportunities for internet-enabled services based on the Internet of Things, such as ‘vehicle-to-everything’ communications, to improve efficiency.

Mark Jenkinson, City Director, London, at Siemens said: “As part of the Mayor of London’s ‘Smarter London Together’ plan to transform London into the smartest city in the world, the Greater London Authority has set out how they want to collaborate with the capital’s boroughs and services, from Transport for London to the National Health Service: working more effectively with the tech community, universities and other cities to make their vision a reality.”

Data from the 21 indicators has been mapped across three themes: Sustainability, Mobility and Opportunity, creating a unique visualization of each city’s digitalization landscape. Visitors to the Atlas of Digitalization can interact with each city and explore its data, taking an in-depth look into how each is addressing its own challenges and opportunities, and how it will impact work and life in future cities.

The Sustainability, Mobility and Opportunity themes are identified by Expo 2020 Dubai as having the power to build partnerships, inspire progress and develop our future cities. Siemens is Infrastructure Digitalization Partner to Expo 2020 Dubai and will use digital solutions to connect, monitor and control buildings at the site using MindSphere, the cloud-based operating system for the Internet of Things.

Source:
https://workplaceinsight.net/london-leads-siemens-atlas-of-digitalization-as-most-digitally-ready-global-city/

26 May 2019

What makes someone a great leader in the digital economy?

What will great leadership look like in five years? What about in 10? Douglas Ready, a senior lecturer in organizational effectiveness at MIT Sloan and an expert on executive development, has lately been considering these questions as part of a Big Ideas research initiative with MIT Sloan Management Review and Cognizant. Ready has been thinking, too, about why they matter: we are becoming an ever more digital economy, and leadership must adapt.

Ready asserts that a handful of leadership characteristics will endure no matter what. Integrity comes to mind, as do courage and the ability to execute. But other contextual characteristics, as he describes them, must be responsive to the evolving world of business.

“So, whereas crafting a vision and a strategy is an enduring leadership characteristic, doing so in a transparent, inclusive, and collaborative manner is a contextual characteristic, given the expectations of the new workforce,” Ready writes in a recent article in MIT Sloan Management Review. “Great leaders will need to more artfully merge the ‘what’ with the ‘how’ to thrive in tomorrow’s world.”

“Leading Into the Future” is the first in a yearlong exploration of the future of leadership in the digital economy. The research team is tackling a broad range of subjects related to this issue through a global survey and in-depth executive interviews with those most heavily involved in digital transformation. Below are three insights offered from the series so far.

Mind the mindset gap

In partnership with MIT Sloan Management Review and Cognizant, Ready surveyed more than 4,000 managers and leaders from over 120 countries on their preparedness for the transition to a digital economy. Only 12% of respondents strongly agreed that their organizations’ leaders had the right mindset, and only 9% strongly agreed that their leaders had the proper skills to lead in the digital economy.

To Ready, this “mindset gap” is more concerning than the skills deficit. “We can train for the digital skills that are important for future success,” he writes. “But developing a digital mindset is a more complex challenge because it is a less tangible one to address.” And as long as the mindset gap exists, so do critical blind spots about how the digital economy is eroding old ways of doing business.

Read more:
https://mitsloan.mit.edu/ideas-made-to-matter/what-makes-someone-a-great-leader-digital-economy

25 May 2019
Economist urges Government to manage disruptive technologies

An American economist is strongly advising the Mia Mottley administration to better manage disruptive technologies if the Government is to bring down the country’s soaring debt through growth.

But while many pundits advocate a complete embrace of this technology, University of Michigan economics professor Dr Linda Tesar is warning Government to expect significant short-term pain in order to gain potential benefits in the long run.

Disruptive technology has significantly altered the way businesses or entire industries have traditionally functioned.

In the most recent cases, much of it driven by e-commerce, businesses have been forced to change the way they approach their operations for fear of losing market share or becoming irrelevant.

Amazon has up-ended bricks-and-mortar retailers, Uber ride-sharing has changed the face of public transport, and Airbnb’s online hospitality marketplace has disrupted the hotel and tourism trade.

Tesar, who was the featured speaker as the Central Bank of Barbados’ 6th Distinguished Visiting Fellow, said: “The thing about disruptive technology such as Airbnb and Uber, it is great, but it is disruptive for a reason because it disrupts what is already there. This means that the only way to take advantage of disruptive technology is to be willing to upset existing businesses.

“If you bring in Uber, the taxi drivers aren’t going to be very happy. For example, when driverless truck technology comes on stream, it is going to mean layoffs for many drivers.

“What are these drivers going to do? Re-training and re-tooling are all easy things to say but not so easy in practice.”

Dr Tesar explained that because disruptive technologies are not contained by conventional rules, it is difficult to adequately plan for them in a growth strategy.

She said: “We don’t know what it is because if we knew, we would put a label on it. So, it is very hard to create the conditions for it to grow.

“I think it is tempting to say that growth is the way out, but I think it is dangerous to say that one is going to grow themselves out of debt. While it is probably good in the long run, in the short term it is very painful.”

The economist suggested that with a high debt-to-GDP ratio, Barbados has only three options if Government is to attain the goal of bringing the debt to 60 per cent of GDP by 2033: taxation, cuts in spending and growth.

She said that of the three, growth is the most desirable option, but noted that in the quest for quick growth, unmanaged disruptive technologies become a concern.

Dr Tesar said: “In bringing down the debt there are only three things to do: spend less, tax more or grow. All three of those things are going to contribute to primary surplus, so that you can have a sustainable level of debt.

“Out of those, if one had to pick one, growth would be the one that they choose but getting growth is never that simple.

“How do we grow our way out of debt? One way is to create a climate where businesses can say this is a place where we want to invest.  Another is increasing efficiency by getting more out of what you are currently doing.

“Finally, there is innovation, which sometimes shows up as a technology factor in the production function. It ends up being the residual that we can’t explain.”

Source:
https://barbadostoday.bb/2019/05/23/economist-urges-government-to-manage-disruptive-technologies/

23 May 2019
The future of AI is collaborative

AI is becoming increasingly widespread, affecting all facets of society — even Sonic drive-ins are planning to implement artificial intelligence to provide better customer service.

Of course, every time a new innovation appears in the realm of AI, fears arise regarding its potential to replace human jobs. While this is a reality of adapting to a more tech-driven society, these fears tend to ignore the collaborative and job-creating attributes that AI will have in the future.

The future’s most successful businesses will be those that learn to combine the best attributes of machines and human workers to achieve new levels of efficiency and innovation. In reality, the future of AI will be largely dependent on collaboration with living, breathing human beings.

AI augmenting human performance

In most business settings, AI does not have the ability to make crucial decisions. However, it does have the power to provide greater insights and support to ensure that you make the right decisions faster.

Simply put, there are many tasks that AI can perform faster and more efficiently than humans. It is estimated that we produce 2.5 quintillion bytes of data per day. While individual businesses only produce a tiny fraction of that total, there is no denying that trying to analyze data points drawn from diverse areas such as logistics, marketing and corporate software programs is becoming increasingly difficult.

This is where AI enters the picture. Machine learning allows AI to analyze data points at much greater speed than a person ever could, while also eliminating the risk of data entry errors that so often occur during manual work.

Such systems present data in comprehensive formats that make it far easier to identify trends, opportunities and risks to improve business practices. This trend is already having a significant impact in the business world. A 2016 survey revealed that “61 percent of those who have an innovation strategy say they are using AI to identify opportunities in data that would otherwise be missed.”
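
As a rough illustration of machines flagging what a person might miss, the sketch below runs an off-the-shelf anomaly detector over a few thousand synthetic delivery-time records; the data, the contamination setting and the choice of scikit-learn’s IsolationForest are invented for the example rather than taken from the survey.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative sketch of machines spotting what a person would miss: an anomaly
# detector scans thousands of synthetic delivery-time records and flags the
# handful that deviate from the normal pattern. All data here are invented.

rng = np.random.default_rng(42)
normal_deliveries = rng.normal(loc=48, scale=4, size=(5000, 1))   # hours, typical
delayed_deliveries = rng.normal(loc=90, scale=5, size=(10, 1))    # hidden outliers
records = np.vstack([normal_deliveries, delayed_deliveries])

detector = IsolationForest(contamination=0.005, random_state=0).fit(records)
flags = detector.predict(records)  # -1 marks a record the model considers anomalous
print(f"{(flags == -1).sum()} suspicious records flagged out of {len(records)}")
```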

While AI may not be granted decision-making capabilities for crucial business tasks, its ability to provide reliable, error-free data is already leading to vital insights that completely transform business operations.

AI’s automation capabilities mean it is increasingly being used to streamline mundane tasks and give workers more time for high-level activities. This can make companies more efficient by lowering operating costs and improving productivity. In other words, as AI continues to advance, it will help us do our own jobs even better.

However, the biggest potential for AI comes from machine learning.

As AI learns from new data inputs, it becomes increasingly powerful and better able to assist with more complex tasks and algorithms, further expanding opportunities for collaboration and increased efficiency. Machine learning is helping AI applications better understand a wider range of instructions, and even the context in which a request is made.

This will lead to even faster and more efficient results and help to overcome common problems we see today, such as automated customer service systems being unable to resolve complaints or requests. Even as these systems grow more advanced, however, there will still be many instances where human interaction is needed to achieve the desired resolution.

People will help machines, too

The future doesn’t merely entail AI streamlining everyday tasks or helping us do our jobs better. AI is only possible thanks to human ingenuity, and that trend isn’t going away anytime soon. Future innovations and improvements will be largely dependent on what people are able to produce.

As Russell Glenister explains in an interview with Business News Daily, “Driverless cars are only a reality because of access to training data and fast GPUs, which are both key enablers. To train driverless cars, an enormous amount of accurate data is required, and speed is key to undertake the training. Five years ago, the processors were too slow, but the introduction of GPUs made it all possible.”

Improving GPUs isn’t the only way developers will continue to play a vital role in helping AI advance to new heights. Human guidance will also be necessary to help AI “learn” how to perform desired tasks — particularly for applications where real-time human interaction will be required.

This is especially apparent in virtual assistants such as Alexa or Siri. Alexa’s recent introduction of speech normalization AI has been found to reduce errors by 81 percent, but these results were only achieved after researchers provided training using a public data set containing 500,000 samples. Similar processes have also been used to give these virtual assistants their own distinct personalities.

As AI applications become more complex and more ingrained in day-to-day life, there will also be an increased need for individuals who can explain the findings and decisions generated by a machine.

Supervision of AI applications will also be necessary to ensure that unwanted outcomes — such as discrimination and even racism — are detected and eliminated to prevent harm. No matter how smart AI becomes, it will continue to require human guidance to find new solutions and better fulfill its intended function.

Though AI offers boundless opportunities for innovation and improvement, it won’t be able to achieve its full potential on its own. A collaborative future will see programmers, engineers and everyday consumers and workers more fully integrating AI into their daily lives.

When people and AI work together, the possibilities will be truly limitless.

Source:
https://thenextweb.com/podium/2019/05/21/the-future-of-ai-is-collaborative/

22 May 2019
In the ‘post-digital’ era, disruptive technologies are must-haves for survival

Call it the post-digital era. That’s the era that many organizations have now entered – the era in which advanced digital technologies are must-haves in order to stay competitive in their markets. At least, that is the way that Accenture sees it.

The research and consulting firm has recently published its annual Technology Vision report, looking at where organizations have been putting their technology investments, and which tools and trends they see as top priorities in the next year.

Information Management spoke with Michael Biltz, who heads the research for the annual study, about what this year’s study revealed and what lessons it holds for software professionals and data scientists.


Information Management: In your recent Technology Vision report, what are the most significant findings of interest to data scientists and data analysts?

Michael Biltz: The overarching takeaway from the 2019 Accenture Technology Vision report is that we’re entering a “post-digital” era in which digital technologies are no longer a competitive advantage – they’re the price of admission. This is supported by our research, which found that over 90 percent of companies invested in digital transformation in 2018, with collective spend reaching approximately $1.1 trillion. This indicates that we’re now at a point where practically every company is driving its business with digital capabilities – and at a faster rate than most people have anticipated.

This new environment necessitates new rules for business; what got your company to where it is today will not be enough to succeed in the new post-digital era.

For example, technology is creating a world of intensely customized and on-demand experiences, where every moment will become a potential market – what we call “momentary markets.” Already, 85 percent of those surveyed believe that customer demands are moving their organizations toward individualized and on-demand delivery models, and that the integration of these two capabilities represents the next big wave of competitive advantage.

In other words, success will be judged by companies’ ability to combine a deep understanding of their customers with individualized services delivered at just the right moment.


IM: How do those findings compare with the results of similar previous studies by Accenture?

Biltz: One of the great things about publishing this report annually is that we can observe how the trends evolve year-over-year, with the latest report drawing on insights from earlier editions. The 2019 report builds on last year’s theme of ‘Intelligent Enterprise Unleashed: Redefine Your Company Based on the Company You Keep,’ which focused on how rapid advancements in technologies—including artificial intelligence (AI), advanced analytics and the cloud—are accelerating the creation of intelligent enterprises and enabling companies to integrate themselves into people’s lives, changing the way people work and live.

Expanding on last year’s theme, the 2019 report discusses how the digital enterprise is at a turning point, with businesses progressing on their digital journeys. But digital is no longer a differentiating advantage—it’s now the price of admission. In this emerging “post-digital” world, in which new technologies are rapidly evolving people’s expectations and behavior, success will be based on an organization’s ability to deliver personalized “realities” for customers, employees and business partners. This will require understanding people at a holistic level and recognizing that their outlooks and needs change at a moment’s notice.

IM: Were you surprised by any of these findings, and why so or why not?

Biltz: We were surprised to see that on the one hand, companies are investing time and money into transforming their services and job functions, yet commitment to transforming and reskilling their workforces largely hasn’t kept pace. And when new roles and capabilities are created, many organizations still try to apply traditional (but increasingly outdated) tools, organization structures, training and incentives to support them. This is creating what we call a “Digital Divide” between companies and their employees.


IM: What are the data-related themes and technologies that will be of greatest interest to organizations over the next three years?

Biltz: While all of the themes described in my response to the first question are relevant to forward-looking organizations, the first trend, ‘DARQ Power: Understanding the DNA of DARQ,’ highlights four emerging technologies that companies should explore in order to remain competitive. These are distributed ledger technology (DLT), artificial intelligence (AI), extended reality (XR) and quantum computing (Q).

While these technologies are at various phases of maturity and application, each represents opportunities for businesses to stay ahead of the curve, differentiate themselves and vastly improve products and services.


IM: Did your study shed any light on how prepared organizations are to adopt these technologies and get expected value from them?

Biltz: Yes, our research into the development and application of DARQ technologies was quite revealing. Our research found that 89 percent of businesses are already experimenting with one or more of these technologies, expecting them to be key differentiators. However, the rate of adoption varies between the four technologies, as they’re currently at varying stages of maturity.

Here are a few specifics for each DARQ capability:

  • Distributed Ledger Technologies: 65 percent of executives surveyed reported that their companies are currently piloting or have adopted distributed ledger technologies into their business; 23 percent are planning to pilot this kind of technology in the future.
  • Artificial Intelligence: When asked to rank which of the DARQ technologies will have the greatest impact on their organization over the next three years, 41 percent listed AI as number one. Already, 67 percent are piloting AI solutions, or have already adopted AI across one or more business units.
  • Extended Reality: 62 percent are leveraging XR technologies, and this percentage is set to increase, with 24 percent evaluating how to use XR in the future.
  • Quantum Computing: Although quantum computing is the furthest of the DARQ technologies from full maturity, we’re seeing rapid advances in this area. Consider this: it took 19 years to get from a chip with two qubits (the quantum equivalent of a traditional computer’s bit) to one with 17; within two years of that, Google unveiled a 72-qubit chip. And the technology is becoming more readily available, with software companies releasing platforms that allow organizations without quantum computers of their own to access them via the cloud.


IM: What is your advice on how IT leaders can best educate the C-suite on which so-called disruptive technologies are worth investing in and which aren’t a good match?

Biltz: It’s important to first focus on your long-term vision and strategy for the company, asking questions such as, “Who do we as a company want to be in 5-10 years? What markets will we target? And what role do we want to play in the emerging digital ecosystems?”

Once you understand the answers to these questions, not only do the specific technologies fall into place, they also tend to drive a level of innovation that is usually only expected from the likes of the technology giants.


IM: What industries or types of organizations are the leaders with cutting-edge and disruptive technologies that other organizations can learn from?

Biltz: I think organizations can best learn from companies – regardless of industry – that are exploring how to leverage more than one DARQ capability to unlock value. This is where true disruption lies: those exploring how to integrate these seemingly standalone technologies together will be better positioned to reimagine their organizations and set new standards for differentiation within their industries.

Volkswagen is an excellent example. The company is using quantum computing to test traffic flow optimization, as well as to simulate the chemical structure of batteries to accelerate development. To further bolster the results from quantum computing, the company is teaming up with Nvidia to add AI capabilities to future models.

Volkswagen is also testing distributed ledgers to protect cars from hackers, facilitate automatic payments at gas stations, create tamper-proof odometers, and more. And the company is using augmented reality to provide step-by-step instructions to help its employees service cars.

Read more:
https://www.dig-in.com/news/in-the-post-digital-era-disruptive-technologies-are-must-haves-for-survival

21 May 2019
Artificial intelligence better than humans at spotting lung cancer

Researchers have used a deep-learning algorithm to detect lung cancer accurately from computed tomography scans. The results of the study indicate that artificial intelligence can outperform human evaluation of these scans.

New research suggests that a computer algorithm may be better than radiologists at detecting lung cancer.

Lung cancer causes almost 160,000 deaths each year in the United States, according to the most recent estimates. The condition is the leading cause of cancer-related death in the U.S., and early detection is crucial for both stopping the spread of tumors and improving patient outcomes.

As an alternative to chest X-rays, healthcare professionals have recently been using computed tomography (CT) scans to screen for lung cancer.

In fact, some scientists argue that CT scans are superior to X-rays for lung cancer detection, and research has shown that low-dose CT (LDCT) in particular has reduced lung cancer deaths by 20%.

However, the LDCT procedure still suffers from a high rate of false positives and false negatives. These errors typically delay the diagnosis of lung cancer until the disease has reached an advanced stage, when it becomes too difficult to treat.

New research may safeguard against these errors. A group of scientists has used artificial intelligence (AI) techniques to detect lung tumors in LDCT scans.

Daniel Tse, from the Google Health Research group in Mountain View, CA, is the corresponding author of the study, the findings of which appear in the journal Nature Medicine.

‘Model outperformed all six radiologists’

Tse and colleagues applied a form of AI called deep learning to 42,290 LDCT scans, which they accessed from the Northwestern Electronic Data Warehouse and other data sources belonging to the Northwestern Medicine hospitals in Chicago, IL.

The deep-learning algorithm enables computers to learn by example. In this case, the researchers trained the system using a primary LDCT scan together with an earlier LDCT scan, if it was available.

Prior LDCT scans are useful because they can reveal an abnormal growth rate of lung nodules, thus indicating malignancy.
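
The published model is far more sophisticated, but as a rough sketch of the underlying idea (one network that examines a current scan alongside a prior one, with a placeholder volume standing in when no prior exists), a minimal two-input 3D convolutional model might look like the following; every shape, layer size and the zero-filled “missing prior” convention are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

# Rough illustration only: a tiny two-input 3D CNN that scores malignancy risk
# from a current LDCT volume plus an earlier scan when one is available. The
# shapes, layer sizes and zero-filled "missing prior" convention are invented
# for this sketch; the published model is far larger and more sophisticated.

class TwoScanModel(nn.Module):
    def __init__(self):
        super().__init__()
        def encoder():
            return nn.Sequential(
                nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            )
        self.current_enc = encoder()   # encodes the primary scan
        self.prior_enc = encoder()     # encodes the earlier scan
        self.head = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, current_scan, prior_scan):
        # If no prior scan exists, a zero-filled volume is passed in its place.
        features = torch.cat(
            [self.current_enc(current_scan), self.prior_enc(prior_scan)], dim=1
        )
        return torch.sigmoid(self.head(features))  # predicted malignancy probability

model = TwoScanModel()
current = torch.randn(1, 1, 32, 64, 64)  # (batch, channel, depth, height, width)
prior = torch.zeros(1, 1, 32, 64, 64)    # stand-in for "no earlier scan available"
print(model(current, prior).item())
```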

Read more:
https://www.medicalnewstoday.com/articles/325223.php

20 May 2019

New Photonic Microchip Mimics Basic Brain Function

Although the bleeding edge of artificial intelligence has provided us with powerful tools that can outmatch us in specific tasks and best us in even our most challenging games, they all operate as isolated algorithms with none of the incredible associative power of the human brain. Our current computational architecture can’t match the efficiency of our own minds, but a joint team of researchers from the Universities of Münster, Oxford, and Exeter discovered a way to begin narrowing that gap by creating a small artificial neurosynaptic network powered by light.

Today’s computers store memory separately from the processor. The human brain, on the other hand, stores memory in the synapses that serve as the connective tissue between neurons. Rather than transferring memory for processing, the brain has concurrent access to that memory when connected neurons fire. While neuroscience generally accepts the theory of synaptic memory storage, we still lack a definitive understanding of how it works.

Even so, we understand that the brain handles multiple processes simultaneously. The human brain may not match the numeric computational efficiency of a basic calculator by default, but it takes a supercomputer requiring enough electricity to power 10,000 homes to outpace it. The human brain sets a high benchmark for its artificial counterpart in processing power alone, and its significant architectural differences make its capabilities so much more dynamic that such comparisons are almost pointless.

The multinational research team that successfully built the artificial neurosynaptic network used four neurons and 60 synapses, arranged with 15 synapses per neuron. The human cerebrum, by contrast, contains billions of neurons and an even larger number of synapses (a ratio of about 10,000:1). Given the difference in scale, it’s easy to see how this major accomplishment serves only as an initial step in a long journey.

But that doesn’t make the accomplishment any less impressive. The way the chip functions defies simple explanation, but in essence, it uses a series of resonators to guide incoming laser light pulses so they reach the artificial neurons as intended. Without proper wave guidance, the artificial neurons would fail to receive consistent input and would have no practical computational function. Made of phase-change material, the artificial neurons change their properties in response to a focused laser, and this process successfully imitates one tiny piece of the human brain.
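
As a loose software analogy only (the chip itself works optically, with laser pulses and phase-change material rather than code), the sketch below simulates four threshold-firing neurons, each fed by 15 weighted synapses, mirroring the 4-neuron, 60-synapse layout reported above; the weights, inputs and firing threshold are invented.

```python
import numpy as np

# Loose software analogy (not the optical mechanism): four threshold-firing
# neurons, each driven by 15 weighted synapses, echoing the reported 4-neuron /
# 60-synapse chip. Weights, inputs and the threshold are invented for illustration.

rng = np.random.default_rng(1)
n_neurons, synapses_per_neuron = 4, 15
weights = rng.uniform(0.0, 1.0, size=(n_neurons, synapses_per_neuron))  # synaptic strengths
threshold = 4.0

def step(input_spikes):
    # input_spikes: 0/1 pulses arriving on each neuron's 15 synapses this time step
    activation = (weights * input_spikes).sum(axis=1)  # weighted sum per neuron
    return activation >= threshold                     # True = that neuron fires

spikes = rng.integers(0, 2, size=(n_neurons, synapses_per_neuron))
print(step(spikes))  # which of the four neurons fired on this input
```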

Read more:
https://www.extremetech.com/computing/291627-new-photonic-microchip-mimics-basic-brain-function