Early Signals

“What are you doing?” my wife asked. Most of the time I grunt something; I think of this question as more of a greeting than a question. This time, however, I said enthusiastically, “I am reading about the Wiki Wiki Web.” That cracked up the entire family (two kids included). Wiki Wiki means quick in Hawaiian. This was in the 90s, before the days of Wikipedia, and I was browsing the c2 wiki.

Wikis became mainstream when Wikipedia became popular and now my family does not laugh at me anymore.

There are times you have a hunch about certain trends. I felt strongly about databases in the 80s, wikis and application components in the 90s, XML and Python in the early 2000s, and now ML and chatbots. This resulted in my working on an SQL database engine in the mid-80s, database components in the 90s, an XML chip in the 2000s, and Python since 2006. Now it is ML, chatbots, and Natural Language Processing.

Not everything I was excited about became mainstream. RDF and the Semantic Web, OLE from Microsoft, Domain Specific Languages, and Pattern Oriented Languages did not go very far.

Over time, I have built a few rules of thumb for paying attention to early signals in emerging technologies.

  1. What is the research behind the technology, and how long has it been going on? For example, neural networks and ML are several decades old, and AI has gone through several winters along the way.
  2. What is the volume and velocity of research papers? (A small sketch of tracking this appears after the list.)
  3. Is government funding research in this space? (The Internet, the PageRank algorithm, self-driving cars, and many others started as government-funded research.)
  4. Who are the major companies involved in the early adoption of these technologies? For XML it was Microsoft, Sun Microsystems, and several others.
  5. What pilot projects are being done for commercialization, and who is working on them?
  6. Who is hiring in this space?
  7. Which business publications are covering topics about this space?
  8. Which companies are getting funded? Funding is both a leading and a lagging indicator, depending on who is funding and why.
  9. How is the information about this space being propagated? Who is propagating it?
  10. What conversations are happening on Twitter?
  11. Are there books on the subject? Books are usually lagging indicators.
  12. Are these technology topics being covered in conferences?

Some of these indicators are easy to find; others you have to dig for.
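
As an illustration of the second rule above (volume and velocity of research papers), here is a rough sketch that counts, per year, how many arXiv papers match a topic. It is only a sketch: it assumes the public arXiv API's search_query and submittedDate filters behave as documented, and the topic and year range are arbitrary examples.

```python
# Sketch: gauge the volume of research papers on a topic, year by year,
# using the public arXiv API (http://export.arxiv.org/api/query).
# The topic and year range below are arbitrary examples.
import re
import time
import urllib.parse
import urllib.request

def arxiv_paper_count(topic, year):
    """Return how many arXiv papers match `topic` and were submitted in `year`."""
    phrase = urllib.parse.quote_plus(f'"{topic}"')          # exact-phrase query
    date_range = f"[{year}01010000+TO+{year}12312359]"      # submittedDate filter
    url = (
        "http://export.arxiv.org/api/query?"
        f"search_query=all:{phrase}+AND+submittedDate:{date_range}"
        "&max_results=1"                                    # we only need the count
    )
    with urllib.request.urlopen(url) as response:
        feed = response.read().decode("utf-8")
    # The Atom feed reports the total number of matches in <opensearch:totalResults>.
    match = re.search(r"<opensearch:totalResults[^>]*>(\d+)<", feed)
    return int(match.group(1)) if match else 0

for year in range(2012, 2018):
    print(year, arxiv_paper_count("deep learning", year))
    time.sleep(3)  # be polite to the API
```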

7 Things I Learned from Listening to The Culture of Innovation Talk

I really enjoyed watching “The Culture of Innovation” from MIT Technology Review.

The talk covers several interesting topics worth exploring.

  1. Permissionless innovation and innovation at the edges
  2. A culture of practice over theory
  3. The concept of social investing
  4. Connectivity in communities
  5. Peripheral vision and pattern recognition, and how they are the opposite of focus and execution
  6. Attachment bias
  7. Cultures and sub-cultures

My favorite quote from the talk:

We so cherish focus and execution, and they are the opposites of peripheral vision and pattern recognition. Peripheral vision and pattern recognition lead to discovering new ways of doing things.
Here is a link to the video interview with Joi Ito.

Where is Machine Learning Being Applied?

When I give talks on Machine Learning, I often get these questions:

  • What is Machine Learning?
  • What are some Machine Learning Applications?
  • Is Machine Learning Mature?
  • Who is using Machine Learning?
  • How do we get started?

If you use Google or Bing search, get recommendations for books or other products from Amazon, or see hints for the next word to type on a mobile keyboard, you are already using Machine Learning.
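
Those next-word hints are a good, small example of learning from data. Here is a minimal sketch of the idea (not any vendor's actual keyboard model): learn bigram counts from a sample of text and suggest the most likely next word.

```python
# Minimal sketch of next-word prediction: count word bigrams in some text
# and suggest the most frequent followers. Real keyboards use far more
# sophisticated models, but the idea of learning from data is the same.
from collections import Counter, defaultdict

def train_bigrams(text):
    words = text.lower().split()
    followers = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1
    return followers

def suggest(followers, word, k=3):
    return [w for w, _ in followers[word.lower()].most_common(k)]

sample = (
    "machine learning is fun and machine learning is useful "
    "because machine learning learns from data"
)
model = train_bigrams(sample)
print(suggest(model, "machine"))   # -> ['learning']
print(suggest(model, "learning"))  # -> ['is', 'learns']
```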

Here is a sample list of Machine Learning applications.

From Apple’s Core ML Brings AI to the Masses:

  • Real Time Image Recognition
  • Sentiment Analysis
  • Search Ranking
  • Personalization
  • Speaker Identification
  • Text Prediction
  • Handwriting Recognition
  • Machine Translation
  • Face Detection
  • Music Tagging
  • Entity Recognition
  • Style Transfer
  • Image Captioning
  • Emotion Detection
  • Text Summarization

From Seven Machine Learning Applications at Google:

  • Google Translate
  • Google Voice Search
  • Gmail Inbox Smart Reply
  • RankBrain
  • Google Photos
  • Google Cloud Vision API
  • DeepDream

Also see How Google is Remaking Itself as a “Machine Learning First” Company.

While Apple, Google, Facebook, Amazon, IBM, and Microsoft are the most visible companies in the AI space, take a look at business applications of Machine Learning.

What is Artificial Intelligence (AI)?

Artificial Intelligence (AI) will have a deep impact on our lives, both positive and negative. As with any other tool or technology, a lot depends on how we use it. I often get asked these questions:

  • What is AI?
  • What is good about it?
  • Will it destroy jobs?
  • Will it take over humanity?
  • What do we need to do to leverage AI?

AI traditionally refers to an artificial creation of human-like intelligence that can learn, reason, plan, perceive, or process natural language. These traits allow AI to bring immense socioeconomic opportunities, while also posing ethical and socioeconomic challenges.

Right now the opportunities are in research, technology development, skill development and business application development.

The technologies that power AI (neural networks, Bayesian probability, statistical machine learning) have been around for several decades; some are as old as the late 50s. The availability of Big Data is bringing AI applications to life.
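
As a reminder of how old and how simple the core ideas are, here is a toy sketch of the perceptron, the late-50s building block of neural networks, learning the logical AND function. It is an illustration of the vintage idea, not a description of any modern system.

```python
# Toy perceptron (Rosenblatt, late 1950s): a single artificial neuron
# learning the logical AND function. Modern neural networks stack many
# such units, but the basic idea is this old and this simple.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # inputs
y = np.array([0, 0, 0, 1])                      # AND labels

weights = np.zeros(2)
bias = 0.0
learning_rate = 0.1

for epoch in range(20):
    for inputs, target in zip(X, y):
        prediction = 1 if inputs @ weights + bias > 0 else 0
        error = target - prediction
        weights += learning_rate * error * inputs   # perceptron update rule
        bias += learning_rate * error

print(weights, bias)
print([1 if x @ weights + bias > 0 else 0 for x in X])  # -> [0, 0, 0, 1]
```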

There are concerns about the misuse of AI and a worry that it may proliferate uncontrolled, killing jobs in its wake. Other worries include unethical uses and unintended biases. It is too early to take one side or the other.

Please take a look at Artificial Intelligence and Machine Learning: Policy Paper. It looks at AI through a variety of lenses.

ReadLog: When Leaders Think Aloud…

When leaders think aloud, it is fascinating to listen. Satya Nadella talks about innovation, handling failures, AI, advances in cloud computing, using silicon to speed up machine learning, and a variety of other topics, including bits of Microsoft history and philosophy.

Microsoft had been there, too early. They were too far behind on the Internet and managed to catch up.

On handling failures – instead of saying “I have an idea”, what if you said, “I have a new hypothesis”?

Satya Nadella goes on to talk about some of Microsoft's innovations (accelerating AI using FPGAs), about investing in the future, and about the future of innovation. The article is a good read.

Q&A with Microsoft CEO Satya Nadella: On artificial intelligence, work culture, and what’s next

Insights into IoT

The Internet of Things, or IoT, may be the most important online development yet

The first thing in a new technology is that people do all the obvious things that look like the old market, but more efficiently. On the Internet, GNN had web ads like old newspaper ads. Later there was Google search, which was a different way of doing advertising, focusing more on data. Now we’ve got social search and social networks. The business model moves to something that is more native to the technology.

Dart, Swift and Popularity of Big Data and Computational Statistics

Watching programming language popularity is one of my hobbies. The TIOBE index for November 2014 shows some interesting trends. Let us take a look.

[Charts: TIOBE index rankings, November 2014 and August 2014]

This paragraph from the TIOBE site is worth noting:

Thanks to the big data hype, computational statistics is gaining attention nowadays. The TIOBE index lists various of these statistical programming languages available, e.g. Julia (position #126), LabView (#63), Mathematica (#80), MATLAB (#24), S (#84), SAS (#21), SPSS (#104) and Stata (#110). Most of these languages are getting more popular every month. The clear winner of the pack is the open source programming language R. This month it jumped to position 12, while being at position 15 last month.

Other trends:

  1. The top 7 languages (from a year ago) retain their spots, but all of them drop a bit in popularity.
  2. Dart, a programming language from Google, jumps into the Top 20 from a previous rank of #81. Dart is a language for building web and cloud apps.
  3. Swift comes from nowhere and enters at the #18 spot. Swift is a new programming language from Apple for iOS and OS X.
  4. Perl and Visual Basic .NET stay in the Top 10. It will be interesting to watch their moves.
  5. F# keeps moving up (from #23 to #16).
  6. Watch the Top 50 languages (#21-#50). Some of them are leading indicators of the future of computing.
  7. To spot potential new entrants into the Top 20, keep an eye on the other languages in the Top 50 on the TIOBE site.
  8. I expected Scala to be on this list, but for some reason I don’t see it. I think it will soon move up into the Top 20.
  9. Three SQL dialects are still in the Top 20. I am not surprised, since SQL is still one of the most popular languages for database programming.
  10. I keep hearing a lot about Julia. I will be watching it with interest.

The images on this page are from InfoMinder, a tool for tracking web pages. I use it to track a few interesting pages on the web. When InfoMinder detects a change in a page, it highlights the change and creates a new changed page. It is one of the tools we built over a decade ago, and it is still chugging along, helping me and others watch the web.
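
InfoMinder's internals are not described here, but the basic idea of watching a page for changes can be sketched in a few lines: fetch the page, compare it with the previously saved copy, and report the lines that differ. The URL and snapshot path below are placeholders.

```python
# Toy sketch of page-change tracking (the general idea, not InfoMinder's
# actual implementation): fetch a page, diff it against the last saved copy,
# and print the lines that changed.
import difflib
import pathlib
import urllib.request

URL = "https://example.com/page-to-watch"    # placeholder
SNAPSHOT = pathlib.Path("snapshot.html")     # previous copy of the page

def fetch(url):
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

current = fetch(URL)
previous = SNAPSHOT.read_text() if SNAPSHOT.exists() else ""

if current != previous:
    # Show only added/removed lines, roughly what a change highlighter surfaces.
    diff = difflib.unified_diff(
        previous.splitlines(), current.splitlines(),
        fromfile="previous", tofile="current", lineterm=""
    )
    print("\n".join(diff))
    SNAPSHOT.write_text(current)
else:
    print("No change detected.")
```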

Recommended Reading: What Will Our World Look Like in 2022?

Predicting the future is hard and risky. Predicting the future in the computer industry is even harder and riskier due to dramatic changes in technology and limitless challenges to innovation. Only a small fraction of innovations truly disrupt the state of the art. Some are not practical or cost-effective, some are ahead of their time, and some simply do not have a market. There are numerous examples of superior technologies that were never adopted because others arrived on time or fared better in the market. Therefore, this document is only an attempt to better understand where technologies are going. The book The Innovator’s Dilemma and its sequels best describe the process of innovation and disruption.

Nine technical leaders of the IEEE Computer Society joined forces to write a technical report, entitled IEEE CS 2022, symbolically surveying 23 potential technologies that could change the landscape of computer science and industry by the year 2022. In particular, this report focuses on:

  1. Security cross-cutting issues
  2. The open intellectual property movement
  3. Sustainability
  4. Massively open online courses (MOOCs)
  5. Quantum computing
  6. Devices and nanotechnology
  7. 3D integrated circuits
  8. Universal memory
  9. Multicore
  10. Photonics
  11. Networking and interconnectivity
  12. Software-defined networks
  13. High-performance computing (HPC)
  14. Cloud computing
  15. The Internet of Things
  16. Natural user interfaces
  17. 3D printing
  18. Big data and analytics
  19. Machine learning and intelligent systems
  20. Computer vision and pattern recognition
  21. Life sciences
  22. Computational biology and bioinformatics
  23. Medical robotics

You can find the comprehensive report here.

LinkLog: Smart and Connected Health Program

From an NSF request for proposals synopsis:

The goal of the Smart and Connected Health (SCH) Program is to accelerate the development and use of innovative approaches that would support the much needed transformation of healthcare from reactive and hospital-centered to preventive, proactive, evidence-based, person-centered and focused on well-being rather than disease. Approaches that partner technology-based solutions with biobehavioral health research are supported by multiple agencies of the federal government including the National Science Foundation (NSF) and the National Institutes of Health (NIH). The purpose of this program is to develop next generation health care solutions and encourage existing and new research communities to focus on breakthrough ideas in a variety of areas of value to health, such as sensor technology, networking, information and machine learning technology, decision support systems, modeling of behavioral and cognitive processes, as well as system and process modeling. Effective solutions must satisfy a multitude of constraints arising from clinical/medical needs, social interactions, cognitive limitations, barriers to behavioral change, heterogeneity of data, semantic mismatch and limitations of current cyberphysical systems. Such solutions demand multidisciplinary teams ready to address technical, behavioral and clinical issues ranging from fundamental science to clinical practice.

Due in large part to advances in high throughput and connective computing, medicine is at the cusp of a sector-wide transformation that – if nurtured through rigorous scientific innovation – promises to accelerate discovery, improve patient outcomes, decrease costs, and address the complexity of such challenging health problems as cancer, heart disease, diabetes and neurological degeneration.  These transformative changes are possible in areas ranging from the basic science of molecular genomics and proteomics to decision support for physicians, patients and caregivers through data mining to support behavior change through technology-enabled social and motivational support.  In addition to these scientific discoveries, innovative approaches are required to address delivery of high quality, economically-efficient healthcare that is rapidly becoming one of the key economic, societal and scientific challenges in the United States.

Discovered while doing some research on Smart Homes and Places.

IBM Watson – Augmenting Human Knowledge

Amazing! Between Watson, Siri and other similar Natural Language apps, we will be entering a new era of Knowledge Augmentation. I am especially thrilled about the impact it will have on teaching.

Watson looks at the question it is being asked and groups words together, finding statistically related phrases. Thanks to a massively parallel architecture, it then simultaneously uses thousands of language analysis algorithms to sift through its database of 15 terabytes of human knowledge and find the correct answer. The more algorithms find the same answer independently, the more a certain answer is likely to be correct. This is how, back in 2011, it managed to win a game of Jeopardy against two human champions.

In a presentation at the Milken Institute Global Conference, IBM senior vice president and director of research John Kelly III demonstrated how Watson can now list, without human assistance, what it believes are the most valid arguments for and against a topic of choice. In other words, it can now debate for or against any topic, in natural language.

From a Gizmag article on IBM Watson.
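
The "many algorithms independently agreeing" idea described in the article is worth a tiny illustration. In the sketch below, several hypothetical analyzers each propose an answer with a confidence, and the candidate with the most combined support wins. This is only the voting principle in miniature, not Watson's actual pipeline.

```python
# Toy illustration of evidence combination: several independent "analyzers"
# each propose an answer with a confidence score, and the answer with the
# most combined support is chosen. Not Watson's pipeline, just the principle.
from collections import defaultdict

def combine_evidence(proposals):
    """proposals: list of (answer, confidence) pairs from independent analyzers."""
    support = defaultdict(float)
    for answer, confidence in proposals:
        support[answer] += confidence
    return max(support.items(), key=lambda item: item[1])

proposals = [
    ("Toronto", 0.40),   # hypothetical outputs from different analyzers
    ("Chicago", 0.70),
    ("Chicago", 0.65),
    ("Chicago", 0.55),
    ("Toronto", 0.30),
]
print(combine_evidence(proposals))  # Chicago wins with ~1.9 combined support
```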