Ted Talk – Shyam Sankar: The rise of human-computer cooperation

We are in the era of the fusion of machine and human. This fusion will not merely produce an evolved biological entity; we are talking about a far more complex symbolic field than the human mind can imagine. The expansion of consciousness is a reality, and the two forces have begun to collaborate with each other.

EMR systems deliver better health care – but the gains won't necessarily come from massively expensive systems.

Breakthrough solutions at the clinic or health-facility level will come from more modest EMR systems. In the Long Term Care sector, for example, simpler, smaller, highly focused EMR systems that allow facility managers, healthcare professionals, patients, and their families to track healing processes will do the most good.

It's about minimizing the friction between people and the system. Watch the talk to understand this idea.


Monetizing Big Data

We came across this article by Scott Stainken, General Manager, Global Telecommunications Industry, IBM. It is our intent to cover “Big Data” on this blog. Your comments are appreciated.

In the history of business, data has never been more important than it is today. Knowledge has always been key, but we’ve never before been able to access, manage and act on so much real-time information.

The systems we are developing and deploying all over the world are helping enterprises understand and use all this ‘big data’ to make really smart decisions.

Click here for more.

For definitional purposes see the reference on Big Data from Wikipedia below:
Big data is a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, curation, storage, search, sharing, transfer, analysis, and visualization. The trend to larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to “spot business trends, determine quality of research, prevent diseases, analyze legal citations, combat crime, and determine real-time roadway traffic conditions.”

As of 2012, limits on the size of data sets that were feasible to process in a reasonable amount of time were on the order of exabytes of data. Scientists regularly encounter limitations due to large data sets in many areas: meteorology, genomics, connectomics, complex physics simulations, and biological and environmental research. The limitations also affect Internet search, finance, and business informatics. Data sets grow in size in part because they are increasingly gathered by ubiquitous information-sensing mobile devices, aerial sensory technologies (remote sensing), software logs, cameras, microphones, radio-frequency identification readers, and wireless sensor networks. The world’s technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, 2.5 quintillion (2.5×10^18) bytes of data were created every day. The challenge for large enterprises is determining who should own big data initiatives that straddle the entire organization.
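To put those figures in perspective, here is a quick sanity check. The daily-byte count and the 40-month doubling rate come from the quoted text; the exabyte conversion and the derived annual growth factor are our own arithmetic:

```python
# 2.5 quintillion bytes per day, expressed in exabytes (1 EB = 10**18 bytes)
daily_bytes = 2.5e18
print(daily_bytes / 1e18)        # 2.5 exabytes per day

# Doubling every 40 months implies an annual growth factor of 2**(12/40)
annual_growth = 2 ** (12 / 40)
print(round(annual_growth, 2))   # ~1.23, i.e. roughly 23% more per year
```

In other words, even a modest-sounding doubling period compounds quickly across a decade.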

Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead “massively parallel software running on tens, hundreds, or even thousands of servers”. What is considered “big data” varies depending on the capabilities of the organization managing the set, and on the capabilities of the applications that are traditionally used to process and analyze the data set in its domain. “For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration.”

On innovation


A recent George Brown College report titled Toronto Next: Return on Innovation explains why companies struggle with the concept of innovation. We are most likely to think of innovation as creating new things, rather than as creativity applied to solving problems. However, the study shows that innovation is best tied to concrete and measurable outcomes, especially making an existing business or technology process more efficient or effective. Innovation, as it turns out, is about solving concrete problems.

One area we are interested in, suggested by the study, is creating data roadmaps to achieve innovation. Big data is expected to grow from a $200 million industry in 2011 to a $3 billion industry in 2013. Of particular interest to us, there are many opportunities for marketing folks to leverage this massive business analytics opportunity. We are actively building a data science resource to bring this capability to our core social media marketing efforts. Stay tuned.

Click here for more on the report.

Click here for a white paper entitled: 7 Tips to Succeed with Big Data in 2013 from Tableau Software.