Lucky # 7 – avoiding information overload

I feel much more at ease than I did yesterday, because I have just read a white paper by the Atos Scientific Community on ‘Information Overload’.

The paper explains that in general we humans can only remember 7 things – what a relief! Trying to remember more creates stress.

Young Girl Talking On Telephone (by David Castillo Dominici at freedigitalphotos.net)

So I decided to stop trying.

Human beings have clear limits on the amount of information they can process, often called bounded rationality. The phenomenon is clearest in the ‘magical number’ that is linked to our short-term memory: at most, we can keep seven (+/- two) items at once in our working memory (…). Any item beyond seven causes the added item (…) to be partially ignored, forgotten, distorted, or otherwise lost.

Suffering from information overload is clearly a choice. If you suffer from it, it is because you fail to recognize your boundaries and/or, in relation to that, fail to use the proper tools to manage your work.

In many corporations there is a fundamental shift in the way information gets distributed and used for collaborative purposes – moving away from email and ‘the corporate newsletter’ (the push mechanisms) towards a more ‘social’ network of information sharing and collaboration.

These Enterprise Social Networks (ESNs) are modelled after public social networks such as Facebook and Twitter, and are being explored as alternative ways of working in enterprises.

Within a corporate social network, much of the potential for Information Overload can be avoided if the phenomenon is kept in mind during the design, creation and use of the network.

So the premise of the paper is that a properly designed and implemented ESN can help avoid Information Overload and increase efficiency in the workplace, but if done incorrectly can increase the problem and create a very unhealthy and inefficient working environment.

In order to understand the success factors, we need to understand human behavior and make sure these factors are addressed:

  • Noise filtering: the capability to assess and reflect on information and filter the data in a way that makes sense for the work we have to do.
  • Predictability: standardization and methodologies help us structure the way we work; focusing on the content instead of the process increases motivation and quality of work.
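The two success factors above can be sketched playfully in a few lines of code: filter the stream down to the topics that matter for your work, and never hand a person more than the ‘magical number’ of items at once. Everything here (the topic tags, the message shape) is an invented illustration, not something from the white paper.

```python
# Playful sketch of the two success factors: noise filtering plus
# respecting the seven-item limit of working memory.
# All names and data are invented for illustration.
MAGICAL_NUMBER = 7

def noise_filter(messages, my_topics):
    """Keep only messages tagged with a topic relevant to my work."""
    return [m for m in messages if m["topic"] in my_topics]

def working_set(messages):
    """Present at most seven items; anything beyond would be
    partially ignored, forgotten or distorted anyway."""
    return messages[:MAGICAL_NUMBER]

stream = [{"topic": "sales", "text": "Q3 numbers"},
          {"topic": "cats", "text": "funny video"},
          {"topic": "sales", "text": "new lead"}]

todo = working_set(noise_filter(stream, {"sales"}))  # 2 relevant items
```

A well-designed ESN does both steps for you before anything reaches your screen.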

If the ESN allows us to address these elements, we have a good chance of success, and we can further increase the quality by understanding how an ESN will impact the way we work.

The concept that every message must be seen by everyone, and that everything said on a corporate social network is relevant to every user, has become outdated in the social media era.

I believe this is a fundamental characteristic of what we will be able to achieve through a successful implementation of an ESN: the empowerment of people.

We move away from the top-down pushing of information and instead create cross-company collaboration processes in (virtual) communities.

 


This blog post was previously published at http://blog.atos.net/blog/2013/09/03/watch-this-space-lucky-7-avoiding-information-overload/


The consequences of a stolen phone

Your wallet is stolen. You wanted to pay for your tall latte and it is gone. You search all of your pockets and look around, bewildered.

Maybe somebody found it and will hand it to you. No. It’s gone. The nice lady at the counter understands and gives you your coffee anyhow. That’s nice.

 

Cafe Latte (by amenic181 at freedigitalphotos.net)

It doesn’t change the fact that your wallet is stolen. In your mind you create a list of everything that is in it. Some money, the tickets for the theater, your bank card and your credit card, some pictures of your children and a business card you got when you bumped into an old friend on your way to the coffee shop.

So now you reach for your phone: you have to call the bank to block your cards, because you do not want some punk to get his dirty hands on your salary.

Oh wait and sh@#$%^&*, your phone is gone too…

In a recent white paper of the Atos Scientific Community, the security aspects of mobile devices are addressed, as well as other aspects of managing devices under the bring-your-own-device concept that many companies now allow and employees wholeheartedly embrace.

The quotes below are from that white paper.

“Enterprise Mobile Management solutions currently available in the market address different aspects of BYO. Balancing those with network & access as well as data and applications usage will pave the way for a successful BYO implementation…”

OK. It is gone. You do a quick mental inventory of what is on your phone.

Access to your personal and business email, Twitter and Facebook accounts. Your contact list of about 400 people, with their email addresses, home addresses and telephone numbers included. On top of that, access to your Dropbox account with all the info on a recent bid and the complete cost breakdown of all products.

And because you have a new NFC-enabled phone, your credit card is also on your phone in digital form. Now what?

“The key area to support BYO in 2016 will be tablets and their descendants (e.g. wearable computers), along with smartphones. We see these as the two key device segments.”

The white paper does not only cover this case of a stolen phone – it goes into all the measures you can take if you adopt the bring-your-own-device scenario in your company.

What to do with applications, data and network access: all these aspects are clearly explained, and some best practices are listed for any CxO who is looking into this.

“Security in such dynamic environments as BYO must be built on the assumption that anyone or any device may get access to the data, but that only authorized users should be able to use it for the intended and agreed purpose, and under a defined context.”
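The principle in that quote – anyone may end up holding the data, but only an authorized user in an agreed context may actually use it – can be sketched as a minimal policy check. The field names, purposes and rules below are invented illustrations, not taken from the white paper.

```python
# Minimal sketch of context-aware access control for BYO devices.
# All names, purposes and rules are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_enrolled: bool   # device is registered with the company
    device_encrypted: bool  # storage encryption is enabled
    purpose: str            # declared reason for access
    network: str            # "corporate", "home", "public"

ALLOWED_PURPOSES = {"read_mail", "edit_bid_documents"}

def may_access(req: AccessRequest) -> bool:
    """Possession of the device is never enough: the device state,
    the declared purpose and the context all have to check out."""
    if not (req.device_enrolled and req.device_encrypted):
        return False
    if req.purpose not in ALLOWED_PURPOSES:
        return False
    # Sensitive use is refused on untrusted networks.
    return req.network in ("corporate", "home")
```

Under a policy like this, the thief at the coffee shop holds the device but gets no usable data from it.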

“Sir? Is this yours?” When you turn around you see a nice person holding up both your phone and wallet – you start breathing again.

At the same time you think about what you could do to avert the disaster that did not happen this time.

 

 


This blog post was previously published at http://blog.atos.net/blog/2013/06/24/watch-this-space-the-consequences-of-a-stolen-phone/


Curiosity drives cloud computing

I like asking questions, and I like getting good answers even better. Because of that, I now have a love/hate relationship with search engines. Most of the time they give me a 50% answer: a kind of direction, a suggestion, a kind of coaching towards the real answer. It is like the joke about the consultant: “the right answer must be in there somewhere, because he or she gives me so many responses”.

In spite of all kinds of promises, search engines have not really increased their intelligence. Complex questions with multiple variables are still nearly impossible to get answered, and the suggestions to improve my question are mostly about my spelling, or about a different subject the search engine would rather have been asked about.

So nothing really good is coming from search engines then? Well, arguably search engines have brought us cloud computing and very powerful access to lots and lots and lots of data, otherwise known as ‘the world wide web’.

No wonder I envision that powerful access and cloud computing are the two most important values we want to keep while increasing the capacity and intelligence to do real analytics on large data sets.

In a white paper of the Atos Scientific Community, these two elements are explored in great depth:

  • Data Analytics needs cloud computing to create an “Analytics as a Service” model, because that model best addresses how people and organizations want to use analytics.
  • This Data Analytics as a Service (DAaaS) model should not behave as an application; it should be available as a platform for application development.

The first statement, on the cloud computing needs, suggests we can expect analytics to become easily deployed and widely accessible, without depending on deep investments by single organizations; ‘as a service’ implies relatively low cost and certainly a flexible usage model.

The second statement, about the platform capability of data analytics, however, has far-reaching consequences for the way we implement and build the analytic capabilities for large data collections.

Architecturally, and due to the intrinsic complexities of analytical processes, the implementation of DAaaS represents an important set of challenges, as it is more similar to a flexible Platform as a Service (PaaS) solution than to a more “fixed” Software as a Service (SaaS) application.

It is relatively easy to implement a single application that will give you an answer to one complex question; many mobile applications are built on this model (take, for example, the many applications for public transport departure and arrival times and connections).

This “1-application-1-question” approach is in my opinion not a sustainable business model for business environments; we need some kind of workbench and toolkit that is based on a stable and well defined service.
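The difference between the “1-application-1-question” model and the platform model can be sketched in a few lines: instead of hard-wiring one application per question, analytic functions are registered once on a shared platform and reused to answer many different questions. The class and service names below are hypothetical illustrations, not an API from the white paper.

```python
# Sketch of the DAaaS platform idea: reusable analytic services on a
# shared registry, rather than one application per question.
# All names are hypothetical illustrations.
from typing import Callable, Dict, List

class AnalyticsPlatform:
    def __init__(self) -> None:
        self._services: Dict[str, Callable[[List[float]], float]] = {}

    def register(self, name: str, fn: Callable[[List[float]], float]) -> None:
        """Add a reusable analytic service to the platform."""
        self._services[name] = fn

    def run(self, name: str, data: List[float]) -> float:
        """Answer a question by combining a service with a data set."""
        return self._services[name](data)

platform = AnalyticsPlatform()
platform.register("mean", lambda xs: sum(xs) / len(xs))
platform.register("range", lambda xs: max(xs) - min(xs))

# Two different "questions" answered by the same shared platform:
delays = [3.0, 5.0, 4.0]
avg_delay = platform.run("mean", delays)
spread = platform.run("range", delays)
```

New questions then cost only a registration call, not a new application.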

The white paper describes a proof of concept that has explored such an environment for re-usability, cloud aspects and flexibility. It also points to the technology used and how the technology can work together to create ‘Data Analytics as a Service’.


This blog post was previously published at http://blog.atos.net/blog/2013/03/25/watch-this-space-curiosity-drives-cloud-computing/



A new business model in 3 easy steps

If you like curly fries you are probably intelligent (1).

This insight comes from the University of Cambridge. The researchers analysed the data from Facebook to show that ‘surprisingly accurate estimates of Facebook users’ race, age, IQ, sexuality, personality, substance use and political views can be inferred from the analysis of only their Facebook Likes’.

The possibility of collecting large amounts of data from people’s everyday activities, factory processes, trains, cars, the weather and just about anything else that can be measured, monitored or otherwise observed is a topic that has been discussed in our blogs many times.

Sometimes labelled ‘The Internet of Things’ or, from a different viewpoint, ‘Big Data’ or ‘Total Data’, the collection and analysis of data has been a topic for technology observations, a source of concern and an initiator of new technology opportunities.

This blog is not about the concerns, nor is it about the new technologies. Instead it is about a view introduced by a new white paper by the Atos Scientific Community called “The Economy of Internet Applications”; a paper that gives us a different, more economic view on these new opportunities.

Let’s take a look at a car manufacturer. The car he (or she) builds will contain many sensors. The data from those sensors will help the manufacturer enable better repairs for that one car; it can provide data from many cars for an analysis to build a better car in the future; and it can show information to the user of the car (speed, mileage, gas). The driver generates the data (if a car is not driven, there is no data) and both the driver and the car manufacturer profit from the result.

Now pay attention, because something important is happening: When the car manufacturer provides the data of the driver and the car combined to an insurance company, a new business model is created.

The user still puts in the data by using the car, and the manufacturer’s sensors in the car still collect the data, but the insurance company gets the possibility to do a better risk analysis on the driver’s behaviour and the car’s safety record.

This would allow the insurance company to give the driver a better deal on his insurance, or to sponsor some safety equipment in the car so there is less risk of big insurance claims for health or property damage.

It would allow the car manufacturer to create more value from data it has already collected, and it would give the driver additional benefits in lower insurance payments or improved safety.
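The insurance side of this exchange can be sketched as a tiny pricing rule: sensor data about driving behaviour feeds a better risk estimate, which feeds a better premium. The thresholds, discount rates and metric names below are all invented for illustration; a real actuarial model would be far richer.

```python
# Illustrative sketch of the insurer's side of the multi-sided market:
# a premium discount derived from car-sensor data. All thresholds and
# rates are invented for illustration.
def premium(base: float, hard_brakes_per_100km: float,
            km_h_over_limit_on_avg: float) -> float:
    """Lower-risk driving behaviour, as observed by the car's own
    sensors, earns a lower premium."""
    discount = 0.0
    if hard_brakes_per_100km < 1.0:
        discount += 0.10  # smooth, anticipating driving style
    if km_h_over_limit_on_avg <= 0.0:
        discount += 0.10  # stays at or under the speed limit
    return base * (1.0 - discount)

careful = premium(600.0, 0.5, -2.0)  # both discounts apply
risky = premium(600.0, 4.0, 12.0)    # no discount
```

Every party gains: the insurer prices risk better, the manufacturer monetizes data it already collects, and the careful driver pays less.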

What just happened is that we created a multi-sided market and it is happening everywhere.

“If you don’t pay for the product, you are the product”

The white paper explains it in more detail but the bottom line is that due to new capabilities in technology, additional data can easily be collected.

This data can be of value for different companies participating in such a data collection and the associated analytics platform.

Based on the economic theory of multi-sided markets, the different participants can influence each other in a positive way, especially across sectors (the so-called network effect).

So there you have it, the simple recipe for a new business model:

  1. Find a place where data is generated. This could be in any business or consumer oriented environment. Understand who is generating the data and why.
  2. Research how (a) that data, or the information in that data, can give your business a benefit, and (b) how data that you own or generate yourself can enrich the data from the other parties.
  3. Negotiate the usage of the data by yourself or the provisioning of your data to the other parties.

In the end this is about creating multiple win scenarios that are based on bringing multiple data sources together. The manufacturer wins because it improves its product, the service provider wins because it can improve its service, and the consumer wins because he receives both a better product and a more tailored service.

Some have said that Big Data resembles the gold rush (2) of many years ago. Everybody is doing it and it seems very simple: just dig in and find the gold – it was even called ‘data mining’.

In reality, with data nowadays, it is even better: if you create or participate in the right multi-sided market, that data, and thus the value, will be created for you.

(1) http://www.cam.ac.uk/research/news/digital-records-could-expose-intimate-details-and-personality-traits-of-millions

(2) http://www.forbes.com/sites/bradpeters/2012/06/21/the-big-data-gold-rush/


This blog post was previously published at http://blog.atos.net/blog/2013/03/18/watch-this-space-a-new-business-model-in-3-easy-steps/


Would you like a cup of IT

The change in the IT landscape brought about through the introduction of Cloud Computing is now driving a next generation of IT enablement. You might call it Cloud 2.0, but the term 'Liquid IT' much better covers what is being developed.

In a recently published white paper by the Atos Scientific Community, Liquid IT is positioned not only as a technology or architecture; it is also very much focused on the effects of this change on the business you are doing day to day with your customer(s).

"A journey towards Liquid IT is actually rather subtle, and it is much more than a technology journey"

The paper explains in detail how more flexible IT provisioning, now done in real time, allows for financial transparency and agility. A zero-latency provisioning and decommissioning model, complete with genuine utility pricing based on actual resources consumed, enables us to drive the optimal blend of minimizing cost and maximizing agility. Right-sizing capabilities and capacity to the needs of the users at all times will impact your customer relationship – but, very importantly, designing such a system starts with understanding the business needs.
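The utility-pricing idea above can be made concrete with a small metering sketch: cost follows consumption per resource-hour, so capacity that is decommissioned simply stops appearing on the bill. The resource names and rates below are invented for illustration; they are not figures from the white paper.

```python
# Sketch of genuine utility pricing: pay only for resources actually
# consumed, metered per hour. All rates are invented for illustration.
RATES = {"cpu_hours": 0.05,          # price per vCPU-hour
         "gb_ram_hours": 0.01,       # price per GB-RAM-hour
         "gb_storage_hours": 0.002}  # price per GB-storage-hour

def monthly_bill(usage: dict) -> float:
    """Zero-latency decommissioning means a resource that stops
    being consumed immediately stops contributing to the bill."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A workload that ran only 200 hours this month, then was decommissioned:
bill = monthly_bill({"cpu_hours": 2 * 200,           # 2 vCPUs for 200 h
                     "gb_ram_hours": 8 * 200,        # 8 GB RAM for 200 h
                     "gb_storage_hours": 50 * 720})  # 50 GB kept all month
```

The financial transparency the paper calls for is exactly this property: every line of the bill traces back to a metered resource and an agreed rate.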

"Liquid IT starts from the business needs: speed, savings, flexibility, and ease of use"

Existing examples of extreme flexibility in IT (think Gmail, Hotmail or other consumer-oriented cloud offerings) have had to balance standardization against scale: the more standard the offering, the greater the scale that can be achieved. This has always been a difficult scenario for more business-oriented applications. The paper postulates that with proper care for business needs and the right architecture, similar flexibility is achievable for business processes.

Such a journey to 'Liquid IT' indeed includes tough choices in technology and organization, but it also forces the providers of such an environment to take an in-depth look at the financial drivers in the IT provisioning and IT consumption landscape.

"The objectives of financial transparency dictate that all IT services are associated with agreed processes for allocation, charging and invoicing"

Two other aspects need to change in parallel with this move to more agility in IT: the role of the CIO will evolve, and the SLAs that he is either buying or selling will change accordingly.

Change management will transform into information management, as the use of IT as a business enabler is no longer the concern of the CIO. IT benchmarking will become an increasingly important tool to measure the level of agility achieved for the business owners. The contribution to business performance will be measured, and it needs to be managed in line with business forecasts.

The white paper authors conclude that "Business agility is the main result of Liquid IT" – sounds like a plan!

This blog post was previously published at http://blog.atos.net/blog/2013/03/08/watch-this-space-would-you-like-a-cup-of-it/