Applications being built today are market makers; the most successful of them shift both technologies and business models. Salesforce.com is an outstanding example: it completely altered a well-established and well-understood market by changing the underlying technology (multi-tenancy), introducing simplicity, and, at the same time, shifting the entire market from enterprise software licensing to subscription-based sales. This seismic shift caught the major market leaders—Oracle, SAP, Microsoft—flat-footed!
The term “next-generation intelligent application,” like most terms in our industry, is used far too loosely and often obscures more than it explains, so part of what we’d like to do in this article is get specific.
From a technology perspective, next-generation applications have complex requirements. A confluence of several different technology shifts has dramatically changed the way that applications are being built. Growing data volumes and an increasing diversity of data sources are merely one aspect. The push toward real-time processing is another. At the same time, IT infrastructure is undergoing a total transformation; the growth of virtual machines, containers, scale-out server clusters, and cloud infrastructure has complicated the task of managing and moving data to make it available to applications quickly. Traditional databases and file systems won’t cut it anymore!
On top of all of that, new application development paradigms, like agile development and microservices, mean that applications are accessing and processing data in new ways. The promise of machine learning and artificial intelligence, evidenced by the announcement of Salesforce’s Einstein product, for example, is accelerating the adoption of new processing engines and software frameworks. Next-generation intelligent applications, then, are not only introducing new business models; they are also taking advantage of modern computational paradigms, powered by modern computational infrastructure.
Today, MapR customers are building next-generation applications on our Data Platform, the industry’s most scalable, agile, and comprehensive data platform, built to leverage these profound shifts in computing technologies in support of modern application requirements. Increasingly, our customers are also extending this capability to leverage data generated by the Internet of Things. As more and more devices and things are connected to the internet and generating data, we have seen a substantial increase in computation that has to be carried out not in the data center and not in the cloud but at the edge, close to where the sensors operate, in order to impact real-time processes.

The new business models emerging from IoT applications are nothing short of disruptive. Connected car applications, including autonomous driving cars, will turn many industries—from transportation to logistics—on their heads. Industrial manufacturing processes enhanced by smart sensors and IoT are improving operational efficiency. Oil and gas companies are using smart sensors on oil drilling rigs, combined with sedimentary analysis data, to improve production. From a data platform perspective, our customers need a seamless, fluid way of moving data and compute from the cloud to the edge and back again. A platform with these capabilities can lower the cost of entry for business models that previously had high capital requirements. Consider, for example, the current move to a pay-for-use model in healthcare and heavy industry – that’s the next generation of application business models.
In short, next-generation applications have to consume and digest data from a wide variety of sources—legacy applications, cloud solutions (like Salesforce), industry data marts, and massive numbers of sensors—and act on it in real time. This requirement represents a major challenge: traditional platforms—databases, data lakes, data warehouses—are too bloated and rigid to handle the massive volumes and agile data movement required. A new platform was needed that could handle the volume, the scale, and the resiliency without repeating the mistakes of the past; it had to be designed for business continuity, automated failover, built-in disaster recovery, and checkpoint restart in the event of a failure – and that’s what we deliver with MapR!
As distributed computing becomes the norm, the movement of data between different physical systems becomes more important to overall application architecture. A single application could span multiple disparate data centers, including private data centers and one or more public cloud data centers (perhaps even from different cloud vendors). The picture becomes even more complex when an application also includes data movement between the Internet of Things and the data center. Today, a single logical application could involve moving data across thousands or even millions of physical systems! That data movement is often the bottleneck for the application, both from a runtime performance perspective and from an application architecture design and implementation perspective. Getting data movement right is important: it has to be fast, secure, and seamless; it has to guarantee no data loss; it has to support omnidirectional data movement; it has to take advantage of different network connections and speeds; and it has to be simple from the developer’s perspective.
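To make the “no data loss” requirement concrete, here is a minimal conceptual sketch in Python of at-least-once delivery between an edge node and a central cluster: the edge side buffers records until they are acknowledged, so a network partition delays delivery rather than losing data. The `EdgeBuffer` and `CentralCluster` names are illustrative only and are not part of any MapR API.

```python
class EdgeBuffer:
    """Conceptual edge-side buffer: holds records until the
    central cluster acknowledges them (at-least-once delivery)."""
    def __init__(self):
        self._pending = {}   # sequence number -> record
        self._next_seq = 0

    def append(self, record):
        seq = self._next_seq
        self._pending[seq] = record
        self._next_seq += 1
        return seq

    def unacked(self):
        # Records still to (re)send; resending can produce duplicates,
        # which is why the receiver deduplicates by sequence number.
        return sorted(self._pending.items())

    def ack(self, seq):
        self._pending.pop(seq, None)


class CentralCluster:
    """Conceptual receiver that deduplicates by sequence number."""
    def __init__(self):
        self.records = {}

    def receive(self, seq, record):
        self.records.setdefault(seq, record)  # idempotent on retry
        return seq  # acknowledgment


def replicate(buffer, cluster, link_up=lambda: True):
    """Push all unacknowledged records; if the link is down,
    leave them buffered for the next attempt (nothing is lost)."""
    for seq, record in buffer.unacked():
        if not link_up():
            break  # network partition: retry later
        buffer.ack(cluster.receive(seq, record))


# Usage: sensor readings survive a temporary partition.
buf, dc = EdgeBuffer(), CentralCluster()
buf.append({"sensor": "rig-7", "pressure": 101.3})
replicate(buf, dc, link_up=lambda: False)   # link down: nothing delivered
buf.append({"sensor": "rig-7", "pressure": 99.8})
replicate(buf, dc)                          # link restored: both delivered
```

Real platforms layer security, compression, and link-speed awareness on top of this basic buffer-and-acknowledge pattern, but the core contract is the same: a record leaves the edge buffer only once the other side has confirmed receipt.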
Our Converge-X Data Fabric provides a single, logical system for managing all your data. If you’re familiar with our platform, you know that MapR is the only platform that can manage any kind of data—including files, tables, and data streams—in a single platform with virtually unbounded scale and unmatched data access speed. Today, our native data movement capabilities—including streaming, mirroring, and replication—are providing incredible value to our customers who are integrating separate physical systems, whether two data centers on different continents or many MapR Edge clusters plus public cloud infrastructure. The simplicity of designing and implementing an application that requires complex data movement on MapR means faster time to value for our customers, who then enjoy the benefits of our platform at runtime, including robustness, security, and blazingly fast performance.
MapR put the traditional big data world on notice with our announcement of the MapR Distributed File and Object Store (MapR XD). We’ve delivered, on top of the world’s most scalable platform, a true enterprise-ready solution that moves the game from picking a cloud platform to cloud neutrality and the edge.
We’ll continue to focus our energies on our outstanding platform while, simultaneously, enhancing the ecosystem by engaging with technology partners and developers to help build the next generation of intelligent applications.