My wife and I are avid campers. We spend at least six weeks a year in various campgrounds in Europe and the United States. Our preferred way of camping is with a tent, and we like to stay in ‘rural’ campsites. In the USA that means campgrounds in national and state parks; in Europe we avoid the big campsites and look for smaller, privately owned sites.
We have a small tent with room for three, so we can easily fit two beds and our luggage. Up until last year we slept on self-inflatable mats, and that worked out fine until I started to get some trouble with my lower back (I am 51 years old). It would hurt when I got up in the morning, and by the second or third day it would start hurting again as soon as we went to bed.
So I started looking for alternatives and came across the Thermarest LuxuryLite Cot. More specifically, the extra-large variant, since I am quite tall (193 cm); because this version is also extra wide, it allows me to sleep on my side, which is my preferred position.
Now, you need to know that this cot is expensive! I ordered it at CAMPZ.NL and the price tag was over 225 euros – which is a lot, but it came with a 100-day return guarantee, so I would be able to test it thoroughly.
Did it work? The answer is a wholehearted ‘yes’ – I used the cot during a four-week camping trip and spent almost 24 nights on this bed (the other nights were spent in a hotel for the necessary showers and laundry facilities) – at more than six different campsites.
Of course, spending that much time away from my own bed affected me somewhat, but overall I slept great and had no back pain to speak of.
The cot consists of a sturdy cloth, two foldable poles that slide into the long sides, and a set of smaller poles and ‘feet’ that allow you to build the cot. It is almost impossible to explain in text, so I refer you to the YouTube video for an explanation of how to set it up ( http://youtu.be/orRtuiTgQ40 )
The cot comes in a small bag that holds all of the components easily, and the complete package is surprisingly light (less than 2 kg). Setting it up takes some effort and upper-arm strength.
After the first week I decided to change the way the struts and feet are positioned. I used to spread them evenly across the length of the bed, but I discovered that a better setup is to place two close together to support my head and neck, leave a bigger gap, and then fit in the rest (I suggest you watch the video – it will make more sense than my textual description here).
After six weeks the bed is still in perfect condition, but I do have some concerns about the holes in the cloth where you fix the ‘feet’; given the tension that is put on them, I expect this is an area that will suffer from the setup process. I also noticed that this is not a very stable bed: getting in and out of it needs to be done with some care – you can absolutely forget about doing anything else than sleeping in this bed.
All in all I am happy with my purchase and hope we can enjoy camping for a long time.
Platforms, or ecosystems, are the virtual malls of the (near) future. We have the Google, Apple and Microsoft platforms – although one could argue that these three have already been surpassed by the likes of Facebook and Amazon, who push the big three into the (undesired) corner of ‘technology providers’.
Let me explain: a mall is a location where different vendors and providers come together and each contributes to the overall experience of the customer, while retaining their own business model. Some malls put this under one brand, creating a store-in-store concept, while others are more like the traditional markets where farmers used to come together to sell their produce.
“The connected train is the monetization of high bandwidth Internet on a moving train where data and transactions are facilitated via a platform.
… to the platform provider(s), it’s the ability to harvest and sell passenger data and facilitate business transactions.”
Bringing technology into places that previously had almost none, and using it to specifically address the circumstances of the consumer, client, traveler, patient, student, or whatever other role people take on, is a great opportunity for both providers and consumers of services.
Before going into the specific value of such a ‘Connected Train’ platform, the authors address the necessary aspects that need to be understood before a platform can be instantiated:
What is the value chain?
Who are the groups that interact at the platform?
What is the role of the technology, what does it make possible?
How does the platform make money?
How will it work?
The authors treat each question, but go a little deeper when looking at the necessary technology – mainly, I presume, because this has been the biggest bottleneck so far in creating this platform. New developments in network technology and continuous improvements in available bandwidth are also expected.
“In 2018 the day-to-day technology of passengers will be very different. We can expect to see mass adoption of wearable computing, the Internet of Things, IPv6 and heads-up display technology (Google glasses). All of these new usages of the Internet will consume considerable bandwidth. Assuming that Moore’s law applies, in order to meet passengers’ demands the connected train must offer 12 Mb/s to each passenger.”
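The 12 Mb/s per passenger figure can be reproduced with a simple doubling argument. A minimal sketch, assuming a hypothetical base rate and a two-year doubling period (both are my assumptions, not figures from the paper):

```python
def projected_bandwidth(base_mbps: float, years: float,
                        doubling_period: float = 2.0) -> float:
    """Project per-passenger bandwidth demand, assuming demand doubles
    every `doubling_period` years (a Moore's-law-style growth model)."""
    return base_mbps * 2 ** (years / doubling_period)

# Hypothetical: 1.5 Mb/s per passenger today, six years out,
# doubling every two years -> 1.5 * 2**3 = 12 Mb/s
print(projected_bandwidth(1.5, 6))  # 12.0
```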
Obviously, once such a platform is finally available, many parties will be able to participate, and some examples are explained in the paper. Besides watching movies, ordering a meal or making an advance restaurant reservation, I found the possible interaction with train staff most interesting.
Because of the platform’s nature, passengers are no longer anonymous travelers to the staff, but can be addressed as returning customers or passengers with special needs. Information can be tailored, and customer remarks (complaints?) can be dealt with much more quickly and on a personal basis.
“This is also an opportunity to upsell products that fit into the TOC service system such as parking or concession. There is an opportunity to tell stations about trains which are overcrowded before they arrive. This gives passengers the knowledge on which to base a decision as to whether to get on this train or wait for the next one.”
It is clear that some big technology challenges still need to be overcome, and that aspects of privacy and pricing structures need to be addressed. But it is also becoming clear that special-purpose or special-location platforms are a great way to bundle a wide range of services on a technology foundation. They allow for collaboration between vendors, create added value for the platform provider and bring great benefits to its users.
And, after reading the whitepaper, it is obvious that platforms and trains are made for each other.
Imagine you have an automatically updated, real-time agenda: it continuously adapts your schedule when meetings run long, predicts and updates your travel time to the next meeting in real time, and adjusts your schedule because it ‘knows’ that any meeting with your best client typically takes 30 minutes longer than you originally plan for.
A proof of concept conducted by the Atos Scientific Community looked at this aspect of predictability and used traffic data from the city of Berlin to see if real-time traffic forecasting (RTTF) was possible. The results are described in a recently published white paper.
“RTTF enables a prediction (within 1 minute) of sensor data streams for the immediate future (up to four hours) and provides traffic condition classification for the upcoming time period based on the forecasted data.”
“The forecast provides a suitable time span for proactively managing upcoming incidents even before they appear.”
The team took a radically different approach to the challenges of today’s traffic management. Instead of proposing yet another reactive traffic management IT system with some smart analytics, the team successfully targeted a proactive approach that provides analytics to predict critical events before they appear. Using historic data and artificial neural network technology, predictions are created for the immediate future and used to determine the traffic status for the next four hours. Based on that information, actions can be taken proactively to mitigate or avoid upcoming events. The next step was utilizing the software and bringing in data scientists with an understanding of the context; this helped define the right parameters and put a pattern-based strategy (PBS) in place.
“Being able to identify patterns out of the existing data, model them into patterns and come up with a system that can provide reliable predictions is a remarkable achievement in itself, but the true value of PBS is being able to apply such capabilities to strategy definition and decision making.”
Working with the subject matter experts, the team identified multiple models that were subsequently implemented in the software. The models are important: they prevent you from falling into the trap of oversimplification. When a car is driving slowly, it can be because of a traffic jam, but it can also be an older person driving more carefully.
By introducing the concept of ‘flow’ – the number of vehicles passing a sensor each hour – the team could identify four different states, which were themselves parameterized by looking at road capacity, speed limits, etc. This information is then fed into a lookup-table-based complex event processing engine in order to predict, within one minute, the traffic situation at given locations.
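To make the idea concrete, here is a toy sketch of such a state classification. The four state names and all thresholds are my invention for illustration; in the paper the states are parameterized from road capacity and speed limits, and the real engine is a lookup-table-based complex event processing system, not a Python function:

```python
def classify_traffic_state(flow_vph: float, avg_speed_kmh: float,
                           capacity_vph: float, speed_limit_kmh: float) -> str:
    """Classify a road segment from 'flow' (vehicles per hour past a
    sensor) and average speed. Combining both signals avoids the trap
    of oversimplification: one slow car does not imply a traffic jam."""
    utilisation = flow_vph / capacity_vph          # how full the road is
    speed_ratio = avg_speed_kmh / speed_limit_kmh  # how fast traffic moves
    if speed_ratio < 0.3:
        return "congested"
    if utilisation > 0.8:
        return "dense"
    if utilisation > 0.4:
        return "stable"
    return "free-flow"

print(classify_traffic_state(300, 48, 2000, 50))   # free-flow
print(classify_traffic_state(1800, 35, 2000, 50))  # dense
```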
Because in real life the historic data is continuously refreshed with the most recent events, the system is able to predict the situation on the road in real time.
The proof of concept clearly showed that a self-learning system, combined with a complex event processing unit and the help of some subject-matter-expert data scientists, can accurately predict the future – the white paper shows this in great detail.
“Real Time Traffic Forecasting is an excellent example of how data sources and identified patterns can be exploited to gain insights and to develop proactive strategies to deal with upcoming events and incidents. It enables a short term view into the future which is long enough to act on predicted incidents rather than react on occurring ones”
For me this proof of concept shows the benefits of data analytics in everyday life, and I am looking forward to this future.
There are three reasons I never go shopping without my smartphone: first, I need to be able to compare the price of what is on sale with the price I would pay elsewhere; second, I like to see a review of the product online; and third, I need to be able to call my wife when I am in doubt about what kind of groceries, or some other unknown item, is written on her shopping list (feminine hygiene products are always challenging for me).
“The shopping experience has suffered a dramatic change over the last decades. Offers are larger and more diversified than ever, globalization is a reality and e-commerce is growing exponentially. Buyers are more demanding, discerning and sophisticated while the traditional selling models are not good enough to secure a sustainable sales flow.”
This change in shopping, fueled by mobile technologies and a much deeper understanding of customers’ behaviors and demands, is the scope of a white paper called “The Future of In-Store Shopping”.
Physical shopkeepers, as the paper explains, are increasingly under pressure to compete with the e-commerce world: to provide an experience with the same convenience as shopping online while at the same time offering the intimacy and satisfaction of being able to touch and discuss a product.
“The answer lies in putting the customer at the center of the value chain through an enhanced shopping experience. Whenever customers interact with the commerce, a new opportunity arises to know them better and offer a more personalized service, which could extend up to negotiating prices on a one-to-one basis.”
New shopping models will be needed to capture the client and deliver the value of being in the shop, while combining the convenience of electronic payment and delivery with the physical shop experience. Possible scenarios include personalization, but also enhancing the experience by showcasing product ranges and providing expert support during the decision-making process.
The reason for being in a store can be further enhanced by making it part of a full end-to-end experience that can even start before you enter the shop. This is something we used to do by sending around leaflets with this week’s offerings, but it can now become a much more sophisticated and personal experience through data analytics of previous purchases or by engaging the customer in communities. This ‘value flow’, which can even include a post-shopping experience, is explained in detail and allows you to understand how you can set this up yourself.
“The better the retailers take care after a purchase, considering it the ‘purchase before the next purchase’, the more likely they are to have won happy and frequent customers.”
Technology will support this change. New payment methods using mobile devices (we have talked about this before in my blogs, and a white paper dedicated to mobile payments is also available) are increasingly available. Other technologies such as geo-location and in-store routing allow consumers to find stores and even navigate to specific locations inside the store. Big Data analytics and all types of product identification through smart labelling, NFC or barcodes will help track both the consumer and the products inside the store and beyond. Better, ‘always-on’ connectivity will provide enough bandwidth to enrich the physical product with lots of additional (meta-)data, giving the customer even more information.
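As a small, concrete example of the product identification mentioned above: EAN-13/GTIN-13 barcodes carry a standard check digit that an in-store app would verify before looking up product data. A minimal sketch (the check-digit algorithm is the standard GS1 one; the sample number is a commonly used test value):

```python
def gtin13_is_valid(gtin: str) -> bool:
    """Validate an EAN-13/GTIN-13 barcode via its check digit:
    weight the first 12 digits alternately 1 and 3 (from the left),
    and compare the result with the 13th digit."""
    if len(gtin) != 13 or not gtin.isdigit():
        return False
    digits = [int(c) for c in gtin]
    total = sum(d * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(digits[:12]))
    return (10 - total % 10) % 10 == digits[12]

print(gtin13_is_valid("4006381333931"))  # True
print(gtin13_is_valid("4006381333932"))  # False
```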
“Initially consumers will start using basic functionalities (find a store, make a shopping list, get product information, etc.) and once they feel confident and see the value, they will access more complex functionalities (make a shopping basket, self-checkout, mobile payment, cloud tickets, etc.). It is important that all these functions are easy to use and they are designed with the consumer at the center, hiding the complexity of the technologies being used (NFC, image recognition, indoor location, etc.)”
And when we look further into the future, we will see possibilities for consumers to get access to the full product life cycle – where was this chair made, what is the origin of this coffee, what are the ingredients of this pizza? The full ecological footprint of the actual product you are touching and putting in your basket will be available. On top of that, using augmented reality, the shop can adapt itself to your mood, informing the staff that you are open to suggestions or want to be left alone.
“Ultimately, what will make stores interesting in the future is the same thing that makes them interesting today: the physical experience of being there, talking to real people who know their products, touching such products and the unbeatable joy at leaving the store with the product in your hands.”
The paper gives you a comprehensive overview and is a good starting point for understanding how customer expectations, technology and the way retailers like to organize their physical business come together. And this is not far away in the future, as I experienced recently when my favorite online retailer opened a physical store in my home town. Interestingly, the location of the store was the result of asking their online customers to find the best spot for them. I’m sure they saved a lot of money because they did not need to hire a specialist; finding the perfect location was outsourced to their customers – in my book, that is clever thinking.
I am a news junkie; I eat, drink, snack, swallow and dine copiously on any news source. My starter is the newspaper in the morning, followed by a quick look at some of my favorites online. During the day, when work allows it, I will visit some other sites and during lunch I might have a second look at the morning newspaper. The evening paper I read after dinner and around 8 or 10, I will watch the evening news on television. Just before turning in, I will check my usual favorite websites again. About 3 or 4 times a week I will check out new background stories on YouTube, TED or some local news sites – they will mostly serve the news in a video format, which is a good break from just reading about stuff.
Still, I am apparently an old fashioned guy:
“Smart mobility is opening up the media market in two dimensions. It is enabling personalized engagement with audience segments previously un-reached, and it is creating the opportunity for a near unlimited range of multi-screen services that enable the users to interact via the second screen.”
A white paper published by the Atos Scientific Community about disruptive changes in media gives an overview of the impact of these changes, and the increased use of smart mobile devices is the first one mentioned. I myself still like news in paper format, but I am also increasingly drawn to using my phone or tablet.
“Socially connected dynamic content creates the opportunity for mass media experiences that are unique to any social graph.”
Secondly, the authors indicate a strong increase in the interactions between producers and consumers of news. This need for direct interaction already existed with radio – many “shock jocks” have used this format to increase the impact of their radio shows – but social media allows for a much larger number of interactions and sometimes, through those interactions, creates its own new news stories. We have seen this when weblogs publish videos of a bank robber or some hooligan beating up innocent people, and the readers actively participate in finding the identity of these persons.
“Any individual has the opportunity to become their own broadcaster, and there are millions of examples of successful user generated channels (…). In this new world, the sole barriers to entry are an idea and basic production skills.”
Thirdly, the paper explains the impact of user-generated content. This used to be a very modest part of the media landscape, most often initiated by the professionals – for example CNN or BBC asking their viewers to upload pictures and movies – but it is now exploding into semi-professional channels on video services like YouTube and Vimeo. With the rise of consumer-friendly video equipment with HD quality, it is no longer expensive to be a creator, and I expect that when technologies like Google Glass become mainstream we will see (no pun intended) even bigger growth in user-generated content.
The paper shows at least four more disruptive changes in the short/medium term, which you will need to discover when it is finally published (hint: intellectual property, real-time advertising, personalization, network capacity).
For anybody looking for the next big thing, the new 'killer app' or the new gold, I recommend reading a white paper by the Atos Scientific Community called "IPv6: How Soon is Now?".
The paper explains very well the problem with the way the internet currently works. It points out that we have a serious issue, a 'time bomb', in the way devices (computers, networking components and other IT equipment) are connected with each other using the old IPv4 technology. The paper further explains why, in spite of all kinds of intermediate technologies, we need to adopt a new technology, called IPv6, and why we need to do that very quickly.
"To sustain the expected Internet growth, there is no adequate alternative to adopting IPv6."
Furthermore, you will read in the paper that we will run into real problems if we do not make that change – and unfortunately the change is happening much too slowly.
"Unfortunately statistics from Google paint a (…) picture with less than 1% of total users being IPv6 clients"
This might sound awfully boring, a field of play for the technology wizards in your organization – not for you, right? But wait: halfway through the paper, the authors start explaining that the benefit of this new technology is that all possible devices (including cars, phones, traffic lights, windmills, televisions, your shoes and wristwatch, medical devices and almost anything else) can become connected – can talk with each other – when we switch to IPv6.
"(…) that IPv6 can now be used on virtually any communicating object, from servers to small sensors, regardless of the underlying (…) network technology."
I think this changes everything; it opens up a whole new world of play for consumers and manufacturers, for service providers and retailers; to create new businesses, to open up new markets and create new ways of making money.
"The IPv6 "Killer App" is likely to be the enablement of the Internet of Things (IoT)"
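The scale behind that claim is easy to illustrate with Python's standard ipaddress module: a single /64 subnet, the normal allocation for one network segment in IPv6, already contains four billion times as many addresses as the whole IPv4 internet.

```python
import ipaddress

v4 = ipaddress.ip_network("0.0.0.0/0")  # the entire IPv4 address space
v6 = ipaddress.ip_network("::/0")       # the entire IPv6 address space

print(v4.num_addresses)  # 4294967296 (2**32)
print(v6.num_addresses)  # 2**128, roughly 3.4e38

# One standard /64 subnet (2001:db8::/64 is the documentation prefix)
# dwarfs the whole IPv4 internet:
subnet = ipaddress.ip_network("2001:db8::/64")
print(subnet.num_addresses // v4.num_addresses)  # 4294967296
```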
Based on this, you would be stupid not to support the move to IPv6; it will be the engine that allows your business to innovate and grow. Your IT landscape will grow a thousandfold, and you will be able to bring any type of information, sensor or other device into your business platform. That is cool and exciting.
But it will not be easy.
"Although many people think that a migration to IPv6 is primarily a networking issue, the truth is that all IT organizations across server, network, storage and application domains must be equally trained to contribute to both the planning and execution."
The authors explain in quite some detail that you will need to overcome technical hurdles (IP Space Management, IP Address Provisioning, IPv6 to IPv4 interoperability, Application IPv6 readiness and Security Challenges) as well as business challenges (Coordination across silos and companies, Timing issues on what to do first and governance to establish End-to-end responsibility).
"We predict a tipping point when there will be more IPv6-connected users and devices, and therefore opportunity, than the IPv4 landscape provides today."
So, want to grow your business, do the strategically right thing and set yourself up for business growth, agility and all the other stuff you need and like? Migrate to IPv6 now.
The paper explains that, in general, we humans can only remember seven things – what a relief! Trying to remember more creates stress.
So I decided to stop trying.
“Human beings have clear limits on the amount of information they can process, often called bounded rationality. The phenomenon is clearest in the ‘magical number’ that is linked to our short-term memory: at most, we can keep seven (+/- two) items at once in our working memory, (…). Any item beyond seven, causes the added item (…) to be partially ignored, forgotten, distorted, or otherwise lost.”
Suffering from information overload is clearly a choice. If you suffer from it, it is because you fail to recognize your boundaries and/or, related to that, fail to use the proper tools to manage your work.
In many corporations we see that there is a fundamental shift in the way information gets distributed and used for collaborative purposes – moving away from email and ‘the corporate newsletter’ (the push mechanisms) into a more ‘social’ network of information sharing and collaboration.
These Enterprise Social Networks (ESN) are modelled after public social networks such as Facebook, Twitter and others, and are being explored as alternative ways of working in enterprises.
“Within a corporate social network, much of the potential for Information Overload can be avoided if the phenomenon is kept in mind during the design, creation and use of the network.”
So the premise of the paper is that a properly designed and implemented ESN can help avoid Information Overload and increase efficiency in the workplace, but if done incorrectly can increase the problem and create a very unhealthy and inefficient working environment.
In order to understand the success factors, we need to understand human behavior and make sure these needs get addressed:
Noise filtering; the capability to assess and reflect on information and filter the data in a way that makes sense for the work we have to do.
Predictability; standardization and methodologies help us structure the way we work – focusing on the content instead of the process increases motivation and the quality of work.
If the ESN allows us to address these elements we have a good chance of success and we can further increase the quality by understanding how an ESN will impact the way we work.
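To make the noise-filtering success factor concrete, here is a toy sketch of how a feed could be filtered against a reader's declared interests. Everything here (the scoring approach, the message texts) is my own illustration, not taken from the paper:

```python
def filter_feed(messages: list, interests: set, threshold: int = 1) -> list:
    """Keep only messages that match at least `threshold` of the
    reader's declared interests (e.g. tags, communities, projects)."""
    def score(msg: str) -> int:
        return len(set(msg.lower().split()) & interests)
    return [m for m in messages if score(m) >= threshold]

feed = ["Quarterly parking lottery results",
        "New API gateway rollout for project atlas",
        "Cafeteria menu update"]
print(filter_feed(feed, {"api", "atlas"}))
# ['New API gateway rollout for project atlas']
```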
“The concept that every message must be seen by everyone, and that everything that is said on a corporate social network is relevant to every user has become outdated in the social media era.”
I believe this is a fundamental characteristic of what we will be able to achieve through a successful implementation of an ESN: empowerment of people.
Moving away from the top-down pushing of information, and instead creating cross-company collaboration processes in (virtual) communities.
Your wallet is stolen. You wanted to pay for your tall latte and it is gone. You search all of your pockets and look around, bewildered.
Maybe somebody found it and will hand it to you. No. It’s gone. The nice lady at the counter understands and gives you your coffee anyhow. That’s nice.
It doesn’t change the fact that your wallet is stolen. In your mind you make a list of everything that is in it: some money, the tickets for the theater, your bank card and your credit card, some pictures of your children, and a business card you got when you bumped into an old friend on your way to the coffee shop.
So now you reach for your phone, you have to call the bank to block your cards, you do not want some punk to get his dirty hands on your salary.
Oh wait and sh@#$%^&*, your phone is gone too…
In a recent white paper of the Atos Scientific Community, the security aspects of mobile devices are addressed, as well as other aspects of managing devices in the new bring-your-own-device (BYO) concept that is being allowed by many companies and wholeheartedly embraced by employees.
The quotes below are from that white paper.
“Enterprise Mobile Management solutions currently available in the market address different aspects of BYO. Balancing those with network & access as well as data and applications usage will pave the way for a successful BYO implementation…”
OK. It is gone. You do a quick mental inventory of what is on your phone.
Access to your personal and business email, Twitter and Facebook accounts. Your contact list of about 400 people, with their email addresses, home addresses and telephone numbers included. On top of that, access to your Dropbox account with all the info on a recent bid and the complete cost breakdown of all products.
And because you have a new NFC enabled phone, your credit card is also in digital format on your phone. Now what?
“The key area to support BYO in 2016 will be tablets and their descendants (e.g. wearable computers), along with smartphones. We see these as the two key device segments.”
The white paper covers more than this case of a stolen phone – it goes into all the measures you can take if you adopt the bring-your-own-device scenario in your company.
What to do about applications, data and network access: all these aspects are clearly explained, and some best practices are listed for any CxO looking into this.
“Security in such dynamic environments as BYO must be built on the assumption that anyone or any device may get access to the data, but that only authorized users should be able to use it for the intended and agreed purpose, and under a defined context.”
“Sir? Is this yours?” When you turn around you see a nice person holding up both your phone and wallet – you start breathing again.
At the same time you think about what you could do to avert the disaster that did not happen this time.
I like asking questions, and I like getting good answers even better. Because of that, I now have a love/hate relationship with search engines. Most of the time they give me a 50% answer: a kind of direction, a suggestion, a kind of coaching towards the real answer. It is like the joke about the consultant: “the right answer must be in there somewhere, because he or she gives me so many responses”.
In spite of all kinds of promises, search engines have not really increased their intelligence. Complex questions with multiple variables are still nearly impossible to get answered, and the suggestions to improve my question are mostly about my spelling, or about a different subject the search engine would rather have been asked about.
So nothing really good has come from search engines, then? Well, arguably, search engines have brought us cloud computing and very powerful access to lots and lots and lots of data, otherwise known as ‘the world wide web’.
No wonder I envision that powerful access and cloud computing are the two most important values we want to keep while increasing the capacity and intelligence to do real analytics on large data sets.
Data Analytics needs cloud computing to create an “Analytics as a Service” – model because that model addresses in the best way how people and organizations want to use analytics.
This Data Analytics as a Service – model (DAaaS) should not behave as an application, but it should be available as a platform for application development.
The first statement, on the need for cloud computing, suggests we can expect analytics to become easily deployed, widely accessible and not dependent on deep investments by single organizations; ‘as a service’ implies relatively low cost and certainly a flexible usage model.
The second statement about the platform capability of data analytics however, has far reaching consequences for the way we implement and build the analytic capabilities for large data collections.
“Architecturally, and due to the intrinsic complexities of analytical processes, the implementation of DAaaS represents an important set of challenges, as it is more similar to a flexible Platform as a Service (PaaS) solution than a more “fixed” Software as a Service (SaaS) application”
It is relatively easy to implement a single application that will give you an answer to one complex question; many applications for mobile devices are built on this model (take, for example, the many apps for public transport departure and arrival times and connections).
This “1-application-1-question” approach is, in my opinion, not a sustainable model for business environments; we need some kind of workbench and toolkit that is based on a stable and well-defined service.
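To illustrate the difference, here is a toy sketch of such a workbench: analytics capabilities are registered once as reusable building blocks and then composed into ad-hoc pipelines, instead of being baked into a single fixed application. All names and the API shape are hypothetical, not taken from the paper:

```python
from typing import Any, Callable, Dict, Iterable

class AnalyticsPlatform:
    """Minimal 'Data Analytics as a Service' workbench sketch."""

    def __init__(self) -> None:
        self._services: Dict[str, Callable[[Any], Any]] = {}

    def register(self, name: str, fn: Callable[[Any], Any]) -> None:
        """Publish a reusable analytics building block."""
        self._services[name] = fn

    def run(self, pipeline: Iterable[str], data: Any) -> Any:
        """Compose registered services into an ad-hoc analysis."""
        for step in pipeline:
            data = self._services[step](data)
        return data

platform = AnalyticsPlatform()
platform.register("clean", lambda rows: [r for r in rows if r is not None])
platform.register("mean", lambda rows: sum(rows) / len(rows))

print(platform.run(["clean", "mean"], [4, None, 8]))  # 6.0
```

The point of the sketch is the platform shape: a new question is answered by composing existing services, not by building and deploying another standalone application.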
The white paper describes a proof of concept that has explored such an environment for re-usability, cloud aspects and flexibility. It also points to the technology used and how the technology can work together to create ‘Data Analytics as a Service’.