MindSphere is missing from Gartner’s Magic Quadrant – is it a big deal? (answer: probably not)


Analyst firm Gartner published their list of Industrial IoT Platform leaders in October 2020. Notably, Siemens MindSphere is missing from the list. The Gartner Magic Quadrant™ does, however, list Hitachi, PTC and Microsoft as leaders and Software AG as a ‘visionary’.

In contrast, in February 2020, analyst firm Forrester published their Forrester Wave ™ with leading Industrial IoT platforms that includes Siemens, together with PTC, Microsoft and C3.ai.

If we look further into both reports, there are other differences in the rankings as well, such as IBM being a ‘contender’ in Gartner’s view and a ‘strong performer’ in Forrester’s.

2020 Forrester Wave (TM) – Industrial IoT Platforms
2020 Gartner Magic Quadrant (TM) – Industrial IoT Platforms

Analyst firms

Obviously, both analyst firms use their own set of criteria and methods to rank the providers of IoT platforms. It is also not uncommon for providers to work with a limited set of analyst firms, so it could be the case that Siemens decided not to participate in Gartner’s research in 2020.

Participating in analyst firm research is a well-established way to create independent evidence of how a solution or product holds up against its competitors. In general, the ranking is based on a combination of product-related questions, customer references and a product demonstration to the analyst review team. The amount of work for the providers should not be underestimated; a significant investment of time is needed to address the many questions and to provide solid customer references. I have done it multiple times in the past and can testify that it is an intense process.

Relevance

However, if we look at the topic of this Magic Quadrant™ and Forrester Wave™, I want to raise an important question: what is the relevance of researching a platform?

Comparing IoT platforms with each other is a bit like comparing Linux to Windows or macOS. These operating systems exist next to each other, and a choice for one or the other is not necessarily based on their underlying functionalities. The real value comes from the available applications, scalability and developer ecosystem. In my opinion, the same applies to (industrial) IoT platforms.

A choice of IoT platform should be based on these three elements, combined with the degree of access to trained staff and product specialists for implementation and maintenance.

Concluding

Finally, I maintain that standardization within the company is an important aspect. Choosing one platform and sticking with it allows for scalability and efficiency, two cornerstones of any IoT implementation.

Based on this, there probably is no ‘best’ platform; there are, however, best use cases for a particular organization. And those use cases are the applications that run on the platform. Just as Microsoft Office runs on Windows, applications for preventive maintenance or digital twins are the right software to focus on in industrial IoT.

I look forward to the analyst firms comparing IoT applications and creating a list of leaders in use cases. Comparing operating systems feels like the wrong investment of time and effort for both the analyst and provider teams.

The journey to managed enterprise IoT – Part 2 – Enabling the use case

<This blog was previously published at the Atos Thought Leadership website – it was written by Philip Griffiths>

In my previous blog, I shared our overview of the journey to managed enterprise IoT, which we divide into three levels of maturity. Here I’ll explain the first level: enabling the use case. For a company to transform data – the world’s most valuable resource – into business outcomes, it first needs to work out how data and IoT will improve the business. This could be by enhancing customer experience or by improving the internal organization. Once the strategy is picked, a use case can be developed and tested with speed and agility to measure the outcome and validate its value. This is ‘enabling the use case’.

By taking advantage of data through IoT projects, companies can deliver a multitude of benefits. Examples include:

  • Customer Experience: Using data to understand market demands, behavior and buying trends, and to develop new products and services your customers will love to use
  • Business Reinvention: Market agility with new business models, products, services and revenue streams
  • Operational excellence: Gain efficiency and agility with data-driven business processes
  • Trust & Compliance: Unleash the power of analytics to protect your assets

To realize these benefits, three key activities need to be carried out:

  1. Strategy & Ideation (S&I): Explore market changes, customer needs, business problems, opportunities and available data to select IoT use cases that enable data insights and sustainable business value. This includes identifying the business processes, any applicable standards and the users.
  2. Proof of Value (PoV): Execute rapid prototyping to test and prove that the use case delivers value – the best approach is a limited scope and time frame. If it holds value, you should also develop a high-level architecture for the future.
  3. Business Case (BC): Define the BC for the next steps – including investment costs and a genuine ROI – and business model that will be used while actively looking ‘beyond the use case’ – see next blog piece.

Across each of these topics, agility and an exploratory nature are critical. We expect use case enablement to take anywhere from two to four months, from the initial workshop and detailed study to developing the PoV with BC – if it is then rolled out as a project, this could take another 6-12 months. There should be a few deliverables (non-exhaustive list):

  • Strategy & Ideation: Data strategy, process scoping, value assessment, connectivity definition, high level plan for PoV and BC, management presentation.
  • Proof of Value: Requirements, built and tested PoV with results, use case feasibility report, project solution architecture, management presentation.
  • Business Case: Strategy map, benefits profile (with KPIs), project costing, business case, investment performance analysis, project plan, management presentation.

The first step in getting more value from your business data is to brainstorm and assess the IoT opportunities that could enable real business benefits. In starting the journey you take one step closer to delivering internal and external changes within your organization based upon the data you already have. I recommend holding a discovery workshop to identify the benefits you will enable and your next steps. Sounds pretty simple and straightforward, doesn’t it?

My customers often find that some data is more valuable than others. Out of the hundreds or thousands of data points that could be gathered, only a small handful will give the highest likelihood of accurately determining a business outcome (following an 80/20 rule). Clustering, co-occurrence and classification analysis techniques can help you to determine which data points produce the greatest value and therefore what you should focus on.
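As an illustration, the 80/20 idea above can be sketched as a simple correlation ranking: out of many candidate data points, keep the handful that track the business outcome most strongly. This is a minimal sketch on synthetic data, not one of the formal clustering or co-occurrence techniques; the column indices and signal strengths are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical telemetry: 200 observations of 10 candidate data points.
X = rng.normal(size=(200, 10))
# A made-up business outcome driven almost entirely by columns 0 and 3.
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(scale=0.1, size=200)

# Rank data points by absolute correlation with the outcome -- a crude
# stand-in for the classification analyses mentioned above.
scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
top_two = sorted(np.argsort(scores)[::-1][:2].tolist())
print(top_two)
```

With these synthetic signals the ranking recovers the two driving columns; on real data, the ‘small handful’ to focus on is whatever rises to the top of such a ranking.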

Check out my next blog where I’ll address the 2nd maturity level: ‘Beyond the use case’.

I would like to add a special thanks to Philip Griffiths (@ThePGriffiths). Philip was, until recently, the strategic partner manager for the IoT practice and took the initiative to write the blog series that you are reading now.

IPv6 – your next cash cow?

For anybody looking for the next big thing, the new 'killer app' or the new gold, I recommend reading a white paper by the Atos Scientific Community called "IPv6: How Soon is Now?".

The paper explains very well the problem with the way the internet currently works. It points out that we have a serious issue, a 'time-bomb', in the way that devices (computers, networking components and other IT equipment) are connected with each other using the old IPv4 technology. The paper further explains why, in spite of all kinds of intermediate technologies, we need to adopt a new technology, called IPv6, and we need to do that very quickly.

"To sustain the expected Internet growth, there is no adequate alternative to adopting IPv6."
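To put that scale difference in concrete terms, Python's standard `ipaddress` module exposes the bit lengths of both address families; the factor computed below is why IPv6 defuses the address 'time-bomb'. The example addresses are the reserved documentation ranges, not real hosts.

```python
import ipaddress

# IPv4 addresses are 32 bits, IPv6 addresses are 128 bits.
ipv4_space = 2 ** ipaddress.IPV4LENGTH      # 4,294,967,296 addresses
ipv6_space = 2 ** ipaddress.IPV6LENGTH      # roughly 3.4e38 addresses
growth_factor = ipv6_space // ipv4_space    # 2**96

# Parsing works identically for both families.
old = ipaddress.ip_address("192.0.2.1")     # IPv4 documentation range
new = ipaddress.ip_address("2001:db8::1")   # IPv6 documentation range
print(old.version, new.version, growth_factor == 2 ** 96)
```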

Furthermore, you will read in the paper that we will run into real problems if we do not make that change – and unfortunately the change is happening much too slowly.

"Unfortunately statistics from Google paint a (…) picture with less than 1% of total users being IPv6 clients"

This might sound awfully boring – a field of play for the technology wizards in your organization, not for you, right? But wait, because halfway through the paper the authors start explaining the benefit of this new technology: it enables all possible technical devices (including cars, phones, traffic lights, wind mills, televisions, your shoes and wrist watch, medical devices and almost anything else) to become connected – to talk with each other – when we switch to IPv6.

"(…) that IPv6 can now be used on virtually any communicating object, from servers to small sensors, regardless of the underlying (…) network technology."

I think this changes everything; it opens up a whole new world of play for consumers and manufacturers, for service providers and retailers; to create new businesses, to open up new markets and create new ways of making money.

"The IPv6 "Killer App" is likely to be the enablement of the Internet of Things (IoT)"

Based on this, it would be foolish not to support the move to IPv6; it will be the engine that allows your business to innovate and grow; your IT landscape can increase a thousandfold and you can bring any type of information, sensor or other device into your business platform. That is cool and exciting.

But it will not be easy.

"Although many people think that a migration to IPv6 is primarily a networking issue, the truth is that all IT organizations across server, network, storage and application domains must be equally trained to contribute to both the planning and execution."

The authors explain in quite some detail that you will need to overcome technical hurdles (IP Space Management, IP Address Provisioning, IPv6 to IPv4 interoperability, Application IPv6 readiness and Security Challenges) as well as business challenges (Coordination across silos and companies, Timing issues on what to do first and governance to establish End-to-end responsibility).
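For the 'Application IPv6 readiness' hurdle, here is one small, hedged illustration: an application that resolves names through the standard socket API can at least discover whether a host offers IPv6 endpoints. The function name is invented for this example, and the result depends on the host's resolver and network stack.

```python
import socket

def ipv6_candidates(hostname: str):
    """Return the IPv6 addresses a hostname resolves to (empty list if none).

    Applications that hard-code IPv4 literals or AF_INET sockets would
    fail this kind of readiness probe.
    """
    try:
        infos = socket.getaddrinfo(hostname, None, socket.AF_INET6)
    except socket.gaierror:
        return []
    return sorted({info[4][0] for info in infos})

# On a dual-stack host, the loopback name typically resolves to ::1.
print(ipv6_candidates("localhost"))
```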

"We predict a tipping point when there will be more IPv6-connected users and devices, and therefore opportunity, than the IPv4 landscape provides today."

So, want to grow your business, do the strategically right thing and set yourself up for business growth, agility and all the other stuff you need and like? Migrate to IPv6 now.


This blog post was previously published at http://blog.atos.net/blog/2013/09/03/watch-this-space-lucky-7-avoiding-information-overload/

Would you like a cup of IT

The change in the IT landscape brought about by the introduction of Cloud Computing is now driving a next generation of IT enablement. You might call it Cloud 2.0, but the term 'Liquid IT' covers much better what is being developed.

In a recently published white paper by the Atos Scientific Community, Liquid IT is positioned not only as a technology or architecture; it is also very much focused on the impact of this change on the business you do day to day with your customer(s).

"A journey towards Liquid IT is actually rather subtle, and it is much more than a technology journey"

The paper explains in detail how the introduction of more flexible IT provisioning, now done in real time, allows for financial transparency and agility. A zero-latency provisioning and decommissioning model, complete with genuine utility pricing based on actual resources consumed, enables us to drive the optimal blend of minimizing cost and maximizing agility. Right-sizing capabilities and capacity, all of the time, to the needs of the users will impact your customer relationship – but, very importantly, designing such a system starts with understanding the business needs.
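Utility pricing based on actual resources consumed can be made concrete with a toy metering model. Everything here – the resource names, the rates, the `invoice` function – is a hypothetical sketch of the idea, not anything proposed in the paper.

```python
from dataclasses import dataclass

@dataclass
class MeteredUsage:
    cpu_hours: float          # compute actually consumed
    gb_storage_hours: float   # gigabytes actually held, per hour

# Hypothetical unit prices.
RATES = {"cpu_hours": 0.05, "gb_storage_hours": 0.001}

def invoice(usage: MeteredUsage) -> float:
    """Charge only for what was consumed: zero usage means a zero bill."""
    total = (usage.cpu_hours * RATES["cpu_hours"]
             + usage.gb_storage_hours * RATES["gb_storage_hours"])
    return round(total, 2)

print(invoice(MeteredUsage(cpu_hours=100, gb_storage_hours=5000)))  # 10.0
```

The point of the model is the decommissioning side: when capacity is released in real time, the metered quantities – and therefore the invoice – drop immediately.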

"Liquid IT starts from the business needs: speed, savings, flexibility, and ease of use"

Existing examples of extreme flexibility in IT (think Gmail, Hotmail or other consumer-oriented cloud offerings) have had to balance standardization against scale: the more standard the offering, the greater the scale that can be achieved. This has always been a difficult scenario for more business-oriented applications. The paper postulates that, with proper care for business needs and the right architecture, similar flexibility is achievable for business processes.

Such a journey to 'Liquid IT' indeed includes tough choices in technology and organization, but also forces the providers of such an environment to have an in-depth look at the financial drivers in the IT provisioning and the IT consumption landscape.

"The objectives of financial transparency dictate that all IT services are associated with agreed processes for allocation, charging and invoicing"

There are two other aspects that need to change in parallel with this move to more agility in IT: the role of the CIO will evolve, and the SLAs that the CIO is either buying or selling will change accordingly.

Change management will transform into information management, as the use of IT as a business enabler is no longer the concern of the CIO alone. IT benchmarking will become an increasingly important tool to measure the level of agility achieved for the business owners. The contribution to business performance will be measured and needs to be managed in line with business forecasts.

The white paper authors conclude that "Business agility is the main result of Liquid IT" – sounds like a plan!

This blog post was previously published at http://blog.atos.net/blog/2013/03/08/watch-this-space-would-you-like-a-cup-of-it/


 

Three reasons to change the Internet now

Times are changing and we all need to adapt. The internet has had a major impact on all of our lives and continues to be a growing force in all aspects of society: in personal interactions, in knowledge management and in the way we do business.

In a whitepaper by the Atos Scientific Community, this evolution of ‘the net’ is described and put in the context of the additional functionality we now expect from our interactions on the internet. The authors challenge the current technology stack that is making up the many, many connections and network capabilities that have to be served to make the internet do what it is supposed to do.

"The topology of the Internet has evolved through economic and technological optimization decisions to a flatter structure where major content providers and distributors get as close as possible to the access networks used by their customers"

There seem to be good reasons to take a close look at this technology evolution and make some choices so we can continue to enjoy the internet:

  1. Because of the cloud computing trend, more and more traffic is concentrated among a few internet powerhouses: Facebook, Amazon, Google and Microsoft. The distributed nature of the original internet simply does not exist anymore.
  2. Because of the huge increase in mobile internet usage, the way that information is accessed, changed and presented differs from past models – the existing networking functionality is not optimized for this type of usage.
  3. Future scenarios predict that, through the assignment of an IP address to almost any device you can think of, we will create a huge peer-to-peer network in which human interaction will be only a small portion of all connections: “the internet of things”. The current internet technology is not designed for this.

These changes raise some fundamental questions, which are described in more detail in the paper. Most noticeably, the authors draw our attention to the fundamental nature of the internet as it is built at the moment: a decentralized web of processing and access points.

“On the long run, the question is raised whether the Internet will durably follow a concentration trend driving it towards a more centralized network or if we will see a new wave of decentralization.”

The whitepaper dives into the technology of the internet and shows where we are facing potential bottlenecks.


[This blog post is a repost of http://blog.atos.net/blog/2012/12/03/watch-this-space-three-reasons-to-change-the-internet-now/ ]