As In-Memory computing has grown in the minds of practitioners, along with solution offerings from major players, analysts have been keeping a watchful eye on this emerging technology. And while the idea itself is not radically new, there’s debate now as to whether or not in-memory is ready for the mainstream. We sit down with someone reading the tea leaves to assess In-Memory’s level of adoption, its innovations and use cases, its pitfalls, and how it is likely to evolve.
Summary
In-Memory Computing is an exciting new technology paradigm that can be potentially transformative across industries. While there’s no truth to the perceptions that In-Memory is simply too expensive or that it can only provide incremental benefits, the reality is that the technology as it stands, along with the marketplace, is still somewhat fragmented.
There are six different technology classes for In-Memory computing, each with its own specific use case, but practitioners will want to be able to plug into their existing IT environment and have the standards be supported by their industry.
IMC becomes most relevant in terms of speed, scale and insight. The ability to move very quickly will be critical in certain industries, but being able to serve large numbers of customers at once, or to analyze large amounts of data easily, are also benefits of In-Memory computing.
And while In-Memory has already been proven to solve problems for many companies, there’s an inherent misconception that In-Memory will allow you to simply plug and play to immediately generate faster insights. Replacing existing technology may only produce incremental value. It’s necessary to adjust your applications to take full advantage. You further need to supplement IMC with the appropriate skills of individuals who will be able to monitor and manage all the insights you are generating.
In-Memory Computing is no longer an immature technology. In many ways it is ready for prime time, and the major technology players have all embraced it to the point that IMC’s benefits and uses will soon become clear. In several years’ time, the current standards challenges impacting the marketplace will disappear, vendors will translate the now theoretical benefits of In-Memory into concrete business benefits on a project-by-project, vertical-by-vertical and process-by-process basis, and many packaged application solutions will be natively built on top of In-Memory computing platforms. It’s only a matter of time until truly innovative uses along horizontal lines are realized.
Don’t wait to begin embracing In-Memory Computing. Engage with business now to determine how the speed, scale and insight created by In-Memory can create a competitive advantage.
Transcript
Sanjog Aul: Welcome listeners, this is Sanjog Aul, your host, and the topic for today’s conversation is “Is In-Memory Computing Ready for the Mainstream?” And I have with me Massimo Pezzini. Massimo is a Vice President and Fellow with Gartner Research. Hello Massimo, thank you for joining us. Now, we’ve heard a lot about In-Memory computing lately, and it’s become cheaper. And even though it’s not a new idea, it hasn’t fully hit the mainstream just yet. So we wanted to explore with Massimo what some of the feasible solutions available are, what the market is like, and what has been leading the way with this technology. So, the first question for you, Massimo, is what would an industry leader come to expect in a good In-Memory Computing solution?
Massimo Pezzini: Well, at this stage unfortunately, there isn’t such a thing, in the sense that the In-Memory computing technology space is still quite fragmented. We have identified at least six different classes of technology, each and every one specific to certain use cases. But fundamentally, apart from the individual specific attributes that you expect from each and every one of these six technology classes, what you want to select as an In-Memory computing enabling technology includes the ability to plug into your enterprise’s established IT environment. In-Memory Computing doesn’t come out of the blue, of course; it needs to be integrated into your existing IT environment. It must be able to support standards. What you’re looking for is technology that is supported by the industry.
“There is a perception in the industry that IMC is expensive; this is not necessarily the case. IMC doesn’t necessarily need to cost a fortune.”
Skill availability is an issue in In-Memory computing, and therefore you want to buy a technology for which there is an available set of services and offerings from system integrators and so on. You want to select a technology which is supported by third parties, for example, independent software vendors and system integrators. And of course, cost is an issue. There is a perception in the industry that IMC is expensive; this is not necessarily the case. IMC doesn’t necessarily need to cost a fortune.
Sanjog: What problems do you think it’s expected to solve? What are we expecting for it to solve for an organization, and how do you think it’s going to affect their day-to-day business?
Massimo: Well, generally speaking, IMC is relevant when it comes to solving three fundamental types of problems: speed, scale and insight. There are business processes for which time is very critical, think about, for example, financial trading, right? The difference between making millions of dollars or losing millions of dollars is a matter of microseconds. So anytime you have a speed problem, IMC can be very helpful.
“IMC is relevant when it comes to solving three fundamental types of problems: speed, scale and insight.”
Scale is another type of issue. In global businesses like E-commerce or global online gaming, speed is important, but even more important is the ability to serve hundreds of thousands, if not millions, if not tens of millions of users utilizing your systems, and IMC can really help a lot in this respect.
And finally, there is insight. Of course In-Memory computing is about processing data very, very quickly, let’s say several orders of magnitude faster than by using traditional spinning-disk technologies. And this gives you an opportunity to analyze a lot of data in a very, very short period of time. You can run business intelligence reports that would take hours using traditional technology in seconds. And this gives you of course an unprecedented level of detail into your business, and helps you collect insights that would be almost impossible to obtain with traditional technologies.
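The kind of in-memory report Massimo describes can be sketched in miniature with Python’s standard-library sqlite3, using a `:memory:` database as a stand-in for a real In-Memory analytics store; the sales table, regions and figures here are purely illustrative.

```python
import sqlite3

# An in-memory SQLite database: all rows live in RAM, no spinning disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0)],
)

# A BI-style report: total revenue per region, computed entirely in memory.
report = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
print(report)  # {'EMEA': 200.0, 'APAC': 50.0} (key order may vary)
```

A production IMC store would hold terabytes across many servers, but the principle is the same: the analytical query never touches disk.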
Sanjog: For those who actually have begun to adopt In-Memory into their organization, how encouraging have the results been? And what problems are they facing or not facing? Because there is a lot of hype that this is going to cause problems, or that it’s not going to cause problems.
Massimo: Well of course, this is a technology. This is software, this is hardware, and I’m not aware of any hardware and software which is not causing problems, so there are indeed problems. But in terms of benefits, we have both incremental and transformative benefits that organizations can experience through In-Memory computing technology. For example, we are seeing organizations running their business intelligence reports in a fraction of the time it would take them using traditional technology. One example is an energy company here in Europe.
Using traditional technology, it would take them something like 12 hours to populate their data warehouse and create a whole bunch of reports that their users need every day in order to plan operations for retail and other types of processes. By using In-Memory computing technology, they have been able to reduce this time from 12 hours to two hours, meaning that now, when people get to the office, they have their reports available and can start working immediately.
“You can run business intelligence reports that would take hours using traditional technology in seconds. And this gives you of course an unprecedented level of detail into your business, and helps you collect insights that would be almost impossible to obtain with traditional technologies.”
But we also see transformative utilization of In-Memory computing. Another great example that I have is a bank in Sweden called Avanza Bank. This company has rebuilt completely from scratch their core banking system and trading system. So let me repeat: core banking system and trading system, on top of an In-Memory architecture. This is giving them a lot of benefits from a business point of view. For example, now they’re able to release new products and services to their customers in a matter of literally a few days. They release a new version of their application every eight working days, for example.
The use of In-Memory computing has helped them solve once and for all the scalability problem. Now they can run an incredible number of transactions per second that would be impossible to run using their previous architecture. And finally, it is helping them introduce product innovation; for example, they’re able to calculate the risk profile of their customers in real time, allowing them to offer their clients very specialized, very specific and very customized terms and conditions, and this is helping them win new clients. They are gaining several thousands of new clients every week because of this innovation they have introduced.
Sanjog: So, in summary, would you say for the most part it is meeting or exceeding expectations, or are there some specific areas where it is not?
Massimo: Well, the problem associated with In-Memory computing is that it’s not plug and play. The rhetoric suggests that you just replace your traditional database with In-Memory computing technology, and all of a sudden the application will automatically run 1000 times faster; that’s not really the case. In many cases, by simply swapping out traditional technology and replacing it with In-Memory, you see marginal improvement, in some cases no improvement at all. This is because most of the time you actually need to rework and adjust your application to take full advantage of In-Memory computing technology.
“The problem associated with In-Memory computing is that it’s not plug and play. The rhetoric suggests that you just replace your traditional database with In-Memory computing technology, and all of a sudden the application will automatically run 1000 times faster; that’s not really the case.”
As I was already saying earlier, skill is an issue. You can buy even the best In-Memory computing technology on the planet, but if you don’t have enough skills, if you don’t have the experience and the expertise, and if you cannot find anybody to help you, the benefits are simply not going to happen. Monitoring and management of In-Memory computing environments is also challenging. You may have hundreds of servers storing your data in memory, and managing what’s going on is really a challenge. So I would say that from a business benefit point of view, in most cases I have seen organizations actually meeting their business goals, but not as easily and cheaply as they would expect.
Sanjog: What is the specific need that the marketplace is driven by? Are companies using Big Data and Real-Time Analytics as an area of innovation profiting most from In-Memory Computing?
Massimo: Well, for sure Big Data & Analytics and real-time analytics are powerful drivers for IMC. As we said, we have seen organizations dramatically improving the performance of their business intelligence applications by leveraging In-Memory computing, but we do see quite a lot of use of In-Memory also for transaction processing applications, for example E-commerce, travel reservation, online gaming, internet banking or online trading. Those are all transaction processing applications that are benefitting a lot from the use of In-Memory computing. Algorithmic trading is an industry that wouldn’t exist without In-Memory, very much like, for example, Cloud. Cloud software as a service is very much enabled by In-Memory computing.
“If you don’t have enough skills, if you don’t have the experience and the expertise and if you cannot find anybody helping you, the benefits are simply not going to happen.”
But I believe the breakthrough innovation driven by In-Memory will come from what we call hybrid transaction/analytical processing: the ability to inject real-time business intelligence, real-time analytics and predictive analytics into the context of transactions or business processes, for example the order-to-cash process, or the ability to run what-if analysis against live data, not against one-week-old or one-month-old data, to decide whether to fulfill an order or to offer the client substitute products, on the basis of considerations such as customer satisfaction or profitability. Those are all things that are enabled by IMC and that will for sure create breakthrough innovation in the industry.
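The hybrid transaction/analytical idea can be illustrated with a toy sketch, again using Python’s stdlib sqlite3 `:memory:` database as a stand-in for an IMC platform; the orders schema and the `place_order`/`whatif_exposure` functions are hypothetical names chosen for the example.

```python
import sqlite3

# One in-memory store shared by the transactional and analytical sides,
# so analytics always runs against live data, not a stale extract.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount REAL, fulfilled INTEGER)")

def place_order(customer, amount):
    """Transactional side: record a new, not-yet-fulfilled order."""
    db.execute("INSERT INTO orders VALUES (?, ?, 0)", (customer, amount))

def whatif_exposure():
    """Analytical side: total unfulfilled order value, on live rows."""
    (total,) = db.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE fulfilled = 0"
    ).fetchone()
    return total

place_order("acme", 1000.0)
place_order("globex", 250.0)
print(whatif_exposure())  # 1250.0 -- reflects both orders immediately
```

In a classic architecture the analytical query would run against a warehouse loaded overnight; here the two workloads see the same in-memory rows, which is the essence of HTAP.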
“In most cases I have seen organizations actually meeting their business goals, but not as easily and cheaply as they would expect.”
Sanjog: What do you think would be a strong vendor solution and what kind of attributes do you think a buyer should look for? In today’s day and age, do you think the available vendor solutions are really feasible and are ready to be fully utilized or ready to create the full value that anyone would look for?
Massimo: Well, there is an assumption in the industry that In-Memory computing is an immature technology; this is not really the case. In-Memory has been around for 10+ years; some products came to the market in the late 90s. So technical maturity is not really an issue. What is an issue is more so the ability of the vendors to translate the theoretical benefits of In-Memory into concrete business benefits on a project-by-project, vertical-by-vertical and process-by-process basis, but also the skills problems, which we already alluded to, and the technical fragmentation. Today it is sometimes difficult to understand whether the best fit for a specific problem is an In-Memory data grid, an In-Memory database, or a complex event processing technology. Those are, I would say, the challenges that customers are facing, but from a technology maturity perspective, I believe that there are a lot of products out there that are ready for prime time.
“There is an assumption in the industry that In-Memory computing is an immature technology; this is not really the case… I believe that there are a lot of products out there that are ready for prime time.”
Sanjog: How is an In-Memory computing solution getting integrated into the whole suite of business functions as opposed to becoming just another piece in the IT tool box?
Massimo: Well, today it’s primarily yet another piece in the tool box, but we do see packaged applications more and more integrating In-Memory computing in their offerings. In some segments, for example supply chain management, you already have in the market a good number of packaged applications which are natively taking advantage of In-Memory to offer superior capabilities and innovations to their customers. E-commerce is another area where we do see a lot of In-Memory technology being used by vendors. Call center operations is another one.
“We’ll see over the next three to five years a large proportion of packaged application solutions natively built on top of In-Memory computing platforms.”
And the most important software vendors, I would say, are working as we speak to integrate In-Memory computing technology in areas like Financials, HR, CRM and other horizontal types of solutions. So today, for the most part, organizations that want to leverage In-Memory computing need to, let’s say, build their own custom solution on top of In-Memory, but we’ll see over the next three to five years a large proportion of packaged application solutions natively built on top of In-Memory computing platforms.
Sanjog: People were saying DRAM was basically one of the show stoppers where the cost of DRAM was so high that people were not adopting it to the degree you would have expected. Is that all being taken care of or do you think we are beyond that part now?
Massimo: Well, I would say that definitely the cost of DRAM technology was an obstacle up until, let’s say, five or six years ago. And undoubtedly the ever-dropping cost of hardware and other technologies, specifically DRAM, is for sure an enabler for In-Memory computing. But there are a few other aspects that need to be taken into consideration. First and foremost, In-Memory computing technology is quite complex. It’s rocket-science technology. It took quite a long time to mature to the point of becoming digestible, let’s say, by mainstream organizations. And moreover, as often happens, IMC was for quite some time a solution looking for a problem, if you see what I mean. In other words, the theoretical benefits were clear, and they had to do with speed, scale, insight, etc., but it took a while for the industry to find the killer application for IMC. We mentioned a few of them already, like supply chain management and supply chain planning, business intelligence, E-commerce and so on and so forth.
“This process of identifying the killer application for In-Memory is still in progress, and we are going to see a lot of innovations in this space.”
But this process of identifying the killer application for In-Memory is still in progress, and we are going to see a lot of innovations in this space. By the way, we believe that In-Memory is going to be a disruptive force in the packaged application business. So we are going to see a lot of changes, and possibly new vendors emerging because of that.
“A lack of standards is an impediment to adoption not only on the part of end-user organizations, but also on the part of independent software vendors or software-as-a-service and service providers’ investment.”
And finally, there is another factor: up until recently, IMC was too small a market to be of any interest for the large software powerhouses. Only small companies had In-Memory computing technology on the market, and they had very few marketing dollars to spend on educating the market. All those factors are going away as the large vendors enter the market and the technology matures. So people like SAP, HP, Microsoft, IBM, Oracle and others are now fully committed to IMC. This means that the marketing dollars for educating the market are now available, so the situation is going to change quite rapidly. I would say that IMC is for the most part ready for prime time.
Sanjog: In terms of standards, when any new technology starts out, there are no specific standards that everybody adheres to, and it becomes a matter of market share creation. Are we beyond that point with IMC, where everyone has one set of standards driving all the innovations and evolutions of IMC?
Massimo: No, we are far from there. You are absolutely 100 percent correct. A lack of standards is a problem, and it is one of the main obstacles to IMC adoption. The various standards and, even more importantly, as we were saying, product categories and classes overlap partially, so sometimes it’s really very hard for customers to figure out which pieces of technology they really need and which they don’t. Moreover, a lack of standards is an impediment to adoption not only on the part of end-user organizations, but also on the part of independent software vendors or software-as-a-service and service providers’ investment, which is of course a problem.
There are a couple of other standardization efforts going on, but they are very minor; they address only small portions of the overall IMC architecture. However, there is good news from this point of view, in the sense that IMC technologies are more and more incorporating pre-IMC standards, very popular standards like SQL, for example, or REST, JSON, JDBC and JPA, which is going to make the transition from traditional technology to In-Memory easier both from a SQL perspective and from an application perspective. For example, in many cases we have seen organizations able to migrate their pre-existing established applications to IMC through the application of these standards. However, I believe it is going to take another three to five years for the technology convergence to occur and for a set of IMC standards to emerge. If I had to point to the main obstacles to IMC adoption, I would say definitely skills, and number two, a lack of standards, no question about that.
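The point about pre-IMC standards easing migration can be sketched concretely: because SQLite speaks standard SQL whether the database lives on disk or in RAM, the same query code runs unchanged after the data is moved into memory. The file name and schema below are made up for the example; `Connection.backup` is the stdlib API (Python 3.7+) for copying one database into another.

```python
import os
import sqlite3
import tempfile

# A pre-existing "legacy" database on disk, queried with standard SQL.
path = os.path.join(tempfile.mkdtemp(), "legacy.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
disk.execute("INSERT INTO accounts VALUES (1, 42.0)")
disk.commit()

# Migrate the established database into RAM via the standard backup API...
mem = sqlite3.connect(":memory:")
disk.backup(mem)

# ...and the unmodified legacy query runs against the in-memory copy.
(balance,) = mem.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()
print(balance)  # 42.0
```

This mirrors the migration path Massimo describes: the shared SQL standard, not anything IMC-specific, is what lets the existing application code move over unchanged.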
Sanjog: So, in terms of the servers and blades that were not being built exclusively for In-Memory computing, does that pose any challenge to the organization? Or, because we are taking pre-IMC standards, are they also being adapted for the hardware as well?
Massimo: Well, I would say so far it is not an obstacle at all, in the sense that we see organizations using commodity servers simply to run their IMC applications. In some cases there are issues associated with disaster recovery and high availability that need to be addressed by the hardware vendors. But this is not really a showstopper; it is only making things a little bit more complicated. However, I believe that over the next few years we are going to see radical changes in the hardware architectures as well.
We are beginning to see hardware vendors designing hardware platforms specifically for supporting IMC technology. This is going to help further reduce the cost, and it’s going to help improve the manageability and the IT operations of In-Memory computing architectures. Consider that today, in order to deploy an application holding maybe a couple of terabytes of data in memory, which is pretty common nowadays, you need to lay down hundreds of servers, which of course creates a lot of IT operations issues. So with new form factors and new hardware architectures, this is going to be reduced dramatically, making everybody’s life easier from an IT operations standpoint.
Sanjog: Are there any specific industries that you think are better suited for IMC, or is this going to become ubiquitous?
Massimo: Well, traditionally telecommunications, financial services and the internet industry have been the pioneers in In-Memory computing technology, because they have very special types of problems, and speed in many of these industries is critical. Moreover, those types of organizations typically have the skills and the resources to invest in new technology. So they have been backing this technology for several years, but that alone has not made it mainstream.
“If your competitor is able to re-plan its supply chain in a matter of seconds and it takes you hours, then you have a problem from a competitive standpoint… If your competitors are able to dynamically change the pricing of their products in real time and you are only able to change your prices once per day, then you have a competitive problem.”
Travel and transportation, for example, are industries widely adopting this type of technology. Retail is another industry that is investing quite significantly in In-Memory, as are energy, utilities, and manufacturing. So we are beginning to see the use of this type of technology pretty much across the board. But I believe that when In-Memory-enabled packaged application solutions such as ERP and CRM hit the market, this technology will become ubiquitous. For example, if your competitor is able to re-plan its supply chain in a matter of seconds and it takes you hours, then you have a problem from a competitive standpoint. You are very likely to jump on In-Memory to regain parity with your competitors.
If your competitors are able to dynamically change the pricing of their products in real time and you are only able to change your prices once per day, then you have a competitive problem, so you are likely to jump into In-Memory computing as well. So I would say that five to 10 years from now, In-Memory will be everywhere. And to a certain extent this is already the case, because people using Facebook or Twitter are already using In-Memory computing. They don’t know it, because it is under the covers, but these types of applications, these types of systems, are fundamentally enabled by In-Memory.
Sanjog: So how do you expect this In-Memory computing paradigm to evolve? And what would be your advice to industry leaders in terms of the due diligence they should perform, or the preparation they should do for their organizations, so that they are able to make the most of this exciting new technology?
Massimo: This is a very fundamental question, and thanks for asking it. In terms of how the technology is going to evolve, we expect a further reduction in the cost of core technology. For example, historically the production cost of DRAM has been dropping by something like 3 percent every 18 months, and we don’t see any sign that this is going to slow down. So you can expect that the cost of core technology will continue to drop. From a software point of view, I think we are going to see these many different types of technology, In-Memory databases, In-Memory data grids, complex event processing platforms, In-Memory analytics and so on, converge into what we call the Unified In-Memory Computing Platform. This is going to be good for the industry by reducing technology fragmentation and probably pushing towards a cohesive set of standards. We are also going to see commodity IMC-specific hardware platforms, as we were saying earlier. Those are trends that we definitely believe are going to take place over the next few years.
“My first recommendation is don’t wait. There is no reason for you to wait another three years to jump into In-Memory… IMC is potentially transformative. So a very important thing that industry leaders should do is to engage with the business leaders of their companies and try to figure out how the speed, scale and insight enabled by IMC can help their companies build differentiation and competitive advantage.”
In terms of recommendations for industry leaders, my first recommendation is don’t wait. There is no reason for you to wait another three years to jump into In-Memory. There is very specific low-hanging fruit you can take advantage of in order to experiment with In-Memory computing. Understand the business and technical benefits, and also the drawbacks, associated with IMC, and start building your skills. But it would be a mistake to consider IMC only as an incremental technology providing incremental benefits. As we discussed during this conversation, IMC is potentially transformative. So a very important thing that industry leaders should do is to engage with the business leaders of their companies and try to figure out how the speed, scale and insight enabled by IMC can help their companies build differentiation and competitive advantage.