Channel Description:

... blogging on what is happening in enterprise software, with a focus on Future of Work and Next Generation Applications, sprinkled with occasional musings on the state of the industry and outlooks on where we are heading.



    The need to manage private clouds seems to be really heating up on the demand side – last week EMC announced plans to acquire Virtustream (see analysis here); today both IBM and Cisco announced the purchase of Blue Box and the intent to purchase Piston Cloud, respectively.

    So let’s dissect the IBM press release in our custom style (it can be found here):

    ARMONK, N.Y - 03 Jun 2015: IBM (NYSE: IBM) today announced it has acquired Blue Box Group, Inc., a managed private cloud provider built on OpenStack.
    Blue Box is a privately held company based in Seattle that provides businesses with a simple, private cloud as a service platform, based on OpenStack. Customers benefit from the ability to more easily deploy workloads across hybrid cloud environments. Financial details were not disclosed.
    MyPOV – Another acquisition in Seattle. Blue Box originally started as a website hosting provider and has evolved into an OpenStack managed private cloud provider.

    Enterprises are seeking ways to embrace all types of cloud to address a wide range of workloads. Today’s announcement reinforces IBM’s commitment to deliver flexible cloud computing models that make it easier for customers to move to data and applications across clouds and meets their needs across public, private and hybrid cloud environments. With Gartner forecasting that 72 percent of enterprises will be pursuing a hybrid cloud strategy this year [1], it is increasingly important for companies to leverage multiple models while maintaining consistent management across their cloud platforms.
    MyPOV – For the longest time the race was about getting public cloud capabilities going. It looks like by mid-2015 enterprises are ready to move to the cloud, albeit in a more conservative fashion, with more private cloud aspects than ever before. With the bare metal capabilities of SoftLayer, IBM is able to give customers higher confidence levels than most competitors, but it seems that this was not enough, hence the Blue Box acquisition.

    Through Blue Box, IBM will help businesses rapidly integrate their cloud-based applications and on-premises systems into OpenStack-based managed cloud. Blue Box also strengthens IBM Cloud’s existing OpenStack portfolio, with the introduction of a remotely managed OpenStack offering to provide clients with a local cloud and increased visibility, control and security.
    MyPOV – This paragraph unveils the ‘crown jewels’ – the capability to manage an OpenStack system deployed remotely. Enterprises may still want to utilize their data centers and see their servers, but are more open to having them managed remotely.

    This move further accelerates IBM’s commitment to open technologies and OpenStack. IBM has 500 developers dedicated to working on open cloud projects to bring new cloud innovations to market. With Forrester Research recently finding that more than twice as many firms use or plan to use IBM Cloud as their primary hosted private cloud platform than the next closest vendor [2], Blue Box is a strategic fit into the IBM Cloud portfolio.
    MyPOV – No surprise – OpenStack compatibility is important and makes Blue Box a good fit.

    Blue Box can enhance and complement developer productivity by:

    · Speeding delivery of applications and data through simplified and consistent access to public, dedicated and local cloud infrastructures

    · Supporting managed infrastructure services across hybrid cloud environments and IBM’s digital innovation platform, Bluemix

    · Offering a single management tool for OpenStack-based private clouds regardless of location

    MyPOV – Good summary of the Blue Box capabilities. Blue Box partnered with the small PaaS vendor Mendix; the mention of Bluemix here may make those deployments uncertain, but we are not aware of a statement in this regard. In our view it may well be worth IBM's while to look at the Mendix capabilities.

    This acquisition will enable IBM to deliver a public cloud-like experience within the client’s own data center, relieving organizations of the burden of traditional private cloud deployments.
    MyPOV – Underlines the value prop: the customer keeps the data center, the management goes to IBM. The acquisition reminds me of Cisco’s recent acquisition of Metacloud (see here), which also had a significant services aspect.

    “IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data rich environment that meet their current and future business needs,” said IBM General Manager of Cloud Services Jim Comfort. “The acquisition of Blue Box accelerates IBM’s open cloud strategy making it easier for our clients to move to data and applications across clouds and adopt hybrid cloud environments."

    “No brand is more respected in IT than IBM,” said Blue Box Founder and CTO Jesse Proudman. “Blue Box is building a similarly respected brand in OpenStack. Together, we will deliver the technology and products businesses need to give their application developers an agile, responsive infrastructure across public and private clouds. This acquisition signals the beginning of new OpenStack options delivered by IBM. Now is the time to arm customers with more efficient development, delivery and lower cost solutions than they've seen thus far in the market.”

    MyPOV – The usual quotes – no comment needed.

    IBM currently plans to continue to support Blue Box clients and enhance their technologies while allowing these organizations to take advantage of the broader IBM portfolio. […]
    MyPOV – Key statement. Blue Box customers should contact IBM asap to make sure they can keep the services that matter to them.

    Overall MyPOV

    A good move for IBM; it adds new service offerings to the IBM Cloud portfolio. Given the recent acquisitions at EMC and Cisco, it looks like the private cloud is alive and well. To a certain point that is a failure of ‘pure’ public cloud players like AWS and Google: it looks like they have not been able to convince CIOs to move all their load to a public cloud setup. It will be interesting to see how the mix of private cloud – e.g. the ones managed with Blue Box – vs. public cloud will end up in a few years. And IBM listens to customers and wants more cloud revenue, so as long as it can get that by running private clouds, IBM will of course do that…

    More on IBM :
    • Event Report - IBM InterConnect - IBM makes bets for the hybrid cloud - read here
    • First Take - IBM InterConnect Day #1 Keynote - BlueMix, SoftLayer and Watson - read here
    • News Analysis - IBM had a very good year in the cloud - 2015 will be key - read here
    • Event Report - IBM Insight 2014 - Is it all coming together for IBM in 2015? Or not? 
    • First Take - Top 3 Takeaways from IBM Insight Day 1 Keynote - read here
    • IBM and SAP partner for cloud - good move - read here
    • Event Report - IBM Enterprise - A lot of value for existing customers, but can IBM attract net new customers? Read here
    • Progress Report - The Mainframe is alive and kicking - but there is more in IBM STG - read here
    • News Analysis - IBM and Intel partner to make the cloud more secure - read here
    • Progress Report - IBM BigData and Analytics have a lot of potential - time to show it - read here
    • Event Report - What a difference a year makes - and off to a good start - read here
    • First Take - 3 Key Takeaways from IBM's Impact Conference - Day 1 Keynote - read here
    • Another week and another Billion - this week it's a BlueMix Paas - read here
    • First take - IBM makes Connection - introduces the TalentSuite at IBM Connect - read here
    • IBM kicks off cloud data center race in 2014 - read here
    • First Take - IBM Software Group's Analyst Insights - read here
    • Are we witnessing one of the largest cloud moves - so far? Read here
    • Why IBM acquired Softlayer - read here
    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    These seem to be the weeks in which the tech giants discover the appeal of managed cloud services – last week EMC acquired Virtustream (my take here), this week IBM acquired Blue Box (my take here) and Cisco announced its intent to acquire Piston Cloud.

    So let’s dissect the Cisco blog (it can be found here) in our custom commentary style:

    Cloud computing has fundamentally altered the IT landscape: dramatically boosting IT agility, while lowering costs. To realize the business advantages of cloud, organizations are shifting to a hybrid IT model—blending private cloud, public cloud, and on-premise applications.

    MyPOV – It looks like the demand for hybrid cloud has gotten much stronger, considering this announcement.

    To help customers maintain control and compliance in this hyper-connected, hyper-distributed IT environment, Cisco and its partners are building the Intercloud—a globally connected network of clouds. Today, Cisco is taking another important step towards realizing our ambitious Intercloud vision.

    MyPOV – Good for Cisco to connect the intended acquisition with Intercloud – my analysis of the original announcement is here.

    We are pleased to announce our intent to acquire Piston Cloud Computing, which will help accelerate the product, delivery, and operational capabilities of Cisco Intercloud Services.

    Paired with our recent acquisition of Metacloud, Piston’s distributed systems engineering and OpenStack talent will further enhance our capabilities around cloud automation, availability, and scale. 

    MyPOV – Piston Cloud’s product delivers a ‘cloud OS’ that can ‘plug in’ a number of cloud services. As such Piston Cloud gives customers a large degree of flexibility in using different technologies, something that enterprises care about as they do not know for sure which technologies they want to use for their next generation application projects. Being able to run these from the same platform is a significant value proposition that Piston Cloud has delivered. Given the heterogeneous nature of the environments traditional Cisco customers operate, certainly a good move. My analysis of the mentioned Metacloud acquisition is here.

    The acquisition of Piston will complement our Intercloud strategy by bringing additional operational experience on the underlying infrastructure that powers Cisco OpenStack Private Cloud.

    MyPOV – So Piston Cloud will become part of the Cisco OpenStack offering – no surprise here. Good to have the clarity.

    Additionally, Piston’s deep knowledge of distributed systems and automated deployment will help further enhance our delivery capabilities for customers and partners.

    MyPOV – Fair for Cisco to admit that Piston Cloud has some serious chops, bringing together such diverse technologies as e.g. Docker and Spark.

    To bring the world of standalone clouds together, Cisco and our partners are building the Intercloud. The Intercloud is designed to deliver secure cloud services everywhere in the world. Our enterprise-class portfolio of technology and cloud services gives customers the choice to build their own private clouds or consume cloud services from a trusted Intercloud Provider. The Intercloud provides choice of services, all with compliance and control. In a nutshell: we’re delivering cloud the way our customers need it.

    MyPOV – The marketing line for Cisco Intercloud. But the intended acquisition of Piston Cloud certainly gives more credibility and, more crucially, more capability in this direction. One can only speculate why these two have come together only now.

    Piston will join our Cloud Services team under the leadership of Faiyaz Shahpurwala, senior vice president, Cloud Infrastructure and Managed Services Organization.

    MyPOV – Always good to hear where acquisitions will be anchored organizationally. I have shared my concerns regarding services leadership building products, but that is a general concern, and I am happy to give Shahpurwala and the team the benefit of the doubt. The future will tell.

    Overall MyPOV

    It can’t be a coincidence that EMC, IBM and Cisco are all making acquisitions in the managed cloud / hybrid cloud space. And it makes sense for all three vendors to push the hybrid agenda, as all three of them have existing sales channels into local data centers. To a certain point it is the failure of the ‘public cloud only' vendors (e.g. AWS and Google) that enterprises do not demand ‘public’ cloud only in 2015.

    Nothing that can be held against Cisco, which for its part tries to differentiate itself by ‘playing nice’ to the demands, needs and (maybe) angst of CIOs in regards to the public cloud. The interesting development is that CIOs seem comfortable letting 3rd parties manage their private cloud infrastructure, but still want to see their data center being utilized. Write-down time frames may play a role here, but overall it is an interesting development that I will spend more time on going forward.

    On the concern side, it is another acquisition and Cisco will have to hold on to the Piston Cloud team talent. But Cisco is an experienced acquirer and knows how to make sure that the key people stay around.

    So overall a good move by Cisco; the flexibility of Piston Cloud has the potential to be a differentiator for Cisco Intercloud. Congratulations.

    More on Cisco

    • Market Move - Cisco wants to acquire Piston Cloud - gets more into cloud - read here
    • New Analysis - Another week another Billion - Cisco Intercloud - A different approach to cloud – better late than never - read here
    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    We had the opportunity to attend the Teradata Influencer Summit held at the beautiful L’Auberge Del Mar, north of San Diego. When mentioning to other influencers earlier in the week that I would be en route to that meeting, I mostly gathered incredulous stares and comments like ‘are they still around and interesting?’. I missed the first half day, but the next one and a half days gave a good insight into where Teradata is and where it wants to be in the next years.

    So here are my top 3 takeaways from the event

    A compelling vision – In recent times Teradata had been criticized for a lack of vision and / or a lack of realism. I followed the Summit two years ago from the sidelines, and back then an executive had an elaborate presentation to show why Hadoop would be a failure – or only a temporary fashion that would disappear as fast as it showed up. But quickly afterwards (and some colleagues say the Summit helped) Teradata reversed course, changed from ‘struggle’ to ‘snuggle’ vis-à-vis Hadoop, embracing more of Hadoop and partnering e.g. with MongoDB already a year ago (and again being a key sponsor of this year’s MongoDB World – also this week, event report here). All slides at the Summit had Teradata, Aster and Hadoop based offerings. As a matter of fact the Teradata UDA includes all three database paradigms.
    The vision of the 'Sentient' Enterprise
    And along these capabilities Teradata is using its by now proven QueryGrid capabilities, following the ‘co-existence’ route with Hadoop – the ‘federated query’ strategy that all vendors of a certain age (5+ years) pursue. As part of that, Teradata sees the world separated into three categories of systems – tightly coupled (Teradata), loosely coupled (Aster) and non-coupled (Hadoop). An interesting approach to the system landscape, which Ratzesberger used to explain the vision of the ‘sentient enterprise’. There are pros and cons with federation, see the MyPOV section below.
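    Teradata's actual QueryGrid is proprietary, so purely as an illustration of the three-tier federation idea described above, here is a toy sketch; every name in it (FederatedRouter, register, query, the tier labels) is invented and nothing here reflects Teradata's real implementation:

```python
# Toy federated-query router: a query is pushed down to whichever
# "tier" (tightly / loosely / non-coupled store) owns the table,
# and results come back to a single caller. Illustration only.

class FederatedRouter:
    def __init__(self):
        self.stores = {}          # tier name -> {table name -> rows}

    def register(self, tier, table, rows):
        self.stores.setdefault(tier, {})[table] = rows

    def query(self, table, predicate):
        # Push the filter down to the tier that holds the table,
        # the way a federated engine pushes work to each system.
        for tier, tables in self.stores.items():
            if table in tables:
                return [row for row in tables[table] if predicate(row)]
        raise KeyError(f"table {table!r} not found in any tier")

router = FederatedRouter()
router.register("tightly_coupled", "orders", [{"id": 1, "amt": 500}])
router.register("non_coupled", "clickstream", [{"page": "/buy", "hits": 9},
                                               {"page": "/home", "hits": 2}])
hot_pages = router.query("clickstream", lambda r: r["hits"] > 5)
print(hot_pages)
```

The point of the sketch is only that the caller never knows (or cares) which tier answered, which is the essence of the federation pitch.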

    It’s coming together – Not sure when Teradata came up with the vision, but obviously work remains to be done here. Teradata needs to write the endpoint to execute queries on the Hadoop side, and it currently supports Hive for that. But Hive has known benefits and challenges, and Teradata will address the challenges soon, stay tuned on that topic. The other critical piece is a tool that can transport data back and forth, extract from a variety of data sources as needed, transform information, apply rules ‘in flight’ etc. Teradata calls it a Listener, but it is much more. In the old times one would call it a bona fide ETL tool, only that today it scales to 21st century requirements with streaming data, humongous data volumes, BAM- and CEP-like support etc. – a key product / tool to make the Teradata UDA fly.
    The Analytical Application Platform
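    Purely as an illustration (nothing here is Teradata's actual Listener API; all names are invented), what such a tool does conceptually, moving records between systems while applying rules 'in flight', can be sketched with Python generators:

```python
# Minimal streaming extract-transform-load sketch: records flow through
# generators, and each rule is applied "in flight" before loading.

def extract(source):
    yield from source                      # pull records as they arrive

def transform(records, rules):
    for rec in records:
        for rule in rules:
            rec = rule(rec)
            if rec is None:                # a rule may drop a record
                break
        if rec is not None:
            yield rec

def load(records, sink):
    sink.extend(records)

events = [{"sensor": "a", "temp": 71}, {"sensor": "b", "temp": -999}]
rules = [
    lambda r: None if r["temp"] < -100 else r,                    # drop bad readings
    lambda r: {**r, "temp_c": round((r["temp"] - 32) / 1.8, 1)},  # enrich in flight
]
warehouse = []
load(transform(extract(events), rules), warehouse)
print(warehouse)
```

Because everything is a generator, no intermediate list of the full stream is ever built, which is what makes the in-flight approach suit streaming volumes.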

    Self-disruption built in – All successful technology companies struggle with disrupting themselves. It’s a price of success: harvesting the benefits of maturing technology creates the potential for technological disruption. Co-President Wimmer remarked that all successful database companies, the 30+ year olds, are set up for disruption today. Usually technology players react to disruption risks with acquisitions, trying to insert new product capabilities and employee talent, and these approaches are usually less successful than intended, for a variety of reasons.

    Teradata has taken a unique approach here, allowing the acquired ThinkBigA to keep operating independently, and even going a step further, funding its global expansion. Unique, because ThinkBigA does pure best-of-breed BigData projects, with complete independence in choosing the technologies customers desire. As such ThinkBigA cannibalizes Teradata customers to a certain extent, and remarkably Teradata does nothing to stop that.
    The Teradata free ThinkBigA architecture
    So kudos to the Teradata management, which, coming off a near-death experience with the Hadoop phenomenon, now has a built-in disruptor with ThinkBigA. Quite a gutsy move, coming close to e.g. an enterprise ERP vendor allowing a subsidiary services company to sell services on competitor products to its customers. What? Granted, the ThinkBigA case is not that drastic, as the provider ‘only’ uses open source, but the disruption is at hand. The benefits for Teradata are twofold: the vendor learns about real customer demands and the quality of open source BigData products, which it can use for its own R&D strategy and effort. Secondly, Teradata has a way to remain relevant (and extract revenues) with customers it otherwise would have lost to BigData / open source players anyway.
    ThinkBigA BigData Project recommendations, notice bullet #5 - emphasis added


    Overall MyPOV

    Teradata is alive and well, with solid product plans for the immediate future. The three-tiered vision of federated queries and data is common for vendors with an existing, substantial business. It resonates well with enterprises that are more conservative both in regards to technology adoption and security needs.

    On the concern side, the verdict is out on whether enterprises operating with a federated storage and query approach will end up missing insights, or only getting served suboptimal insights. Conceptually one can make the case that every insight can only be caught with a single-source storage, one place to be queried etc., but whether the difference between the two approaches is substantial for the success of a business remains to be seen.

    So a key second half of 2015 for Teradata on the product side, with a number of substantial announcements to come and key capabilities in the pipeline. Teradata may have been down, but it’s back on its feet and throwing punches. Only the paranoid survive, and Teradata has gotten a little more paranoid - in a positive way. Good to see for prospects, customers and the overall ecosystem, as nothing fuels innovation better than competition. We will be watching.


    So last week we looked at the need for enterprises to accelerate (find the blog post here), with more of a technology view, as all good ‘inner values’ are based on a decent architecture of the right technology. More specifically these are the enablement of BigData and the provision of ‘true’ analytics (my definition here).

    But even the right technology and architecture can fail if the design principles aren’t right – and it’s key to keep in mind that business user centricity is paramount for successful next generation HR systems that help enterprises get faster. No surprise, as layers of organization, a Tayloresque organization model, competency splits and more have slowed down organizations since… forever (ok, since the invention of these principles, which ironically were designed to make organizations more efficient and faster).

    So let’s look at the next set of three criteria for a successful next generation HR product:

    ‘Lean’ Recruiting

    Recruiter is one of the most challenging positions in the enterprise. CEOs (usually) have 2-3 years before they are removed for bad performance, sales reps get 2-3 quarters, recruiters 2-3 weeks. That’s still better than the 2-3 hours for a waiter or a retail sales clerk, but it is probably one of the most scrutinized positions in the enterprise. If recruiters cannot bring on board the talent an enterprise is looking for after a few weeks, they will be looking for a new job soon. If you keep in mind unfavorable workforce dynamics, the need for acceleration etc., that job will not get easier.

    So let’s keep in mind what the recruiter role was originally put in place for: have a professional who does nothing but recruiting and is therefore better at it than the occasional recruiting done by a manager in the line of business, and save time for that manager, so they don’t have to sift through many resumes, initial interviews etc. But at the same time the creation of the recruiter established a disconnect between the hiring manager and the candidate that has to be overcome for every vacancy. The fact that up to a third of positions in North America are not being filled due to perceived ‘friction’ between recruiters and line of business managers is telling. It looks like in too many cases managers prefer to settle for hope (the employee will get better), have their team work more (‘so hard to find good people’), or kick the can down the road (the next manager can fix this) rather than recruit new talent.

    The good news is that with the advances in BigData and cheap, elastic compute capability from the cloud that allows for ‘true’ analytics, the recruiter can be largely bypassed and a filtered, fitted list of candidates can be served to the manager. The manager has to make the final call anyway, so instead of a recruiter working through many resumes to whittle down the list of good applicants for the final interview, managers can today get the list of final candidates with the help of next generation recruiting software. We are at the cusp of video analysis even adding early conversations to the data trove from which managers can find the best candidates, after software has created the shortlist. And some vendors have even changed the recruiting model. Thanks to the advances of technology, headhunting techniques can be applied. Formerly reserved for six-figure-salary positions, the searching of social networks (interestingly, Facebook beats out the ‘network of liars’, LinkedIn) for best-fit candidates empowers managers to contact the best candidates directly, while they are still in their current jobs. This follows the old adage that the best people are not looking for jobs (but are happily employed somewhere).
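    To make the idea of a 'filtered, fitted list' concrete, here is a deliberately naive sketch of candidate scoring and shortlisting; the scoring model, data and all names are invented for illustration, and real recruiting products use far richer signals:

```python
# Toy candidate shortlisting: score a pool against a job profile on
# skill overlap plus (capped) experience, then return the top names.

def score(candidate, profile):
    skill_overlap = len(set(candidate["skills"]) & set(profile["skills"]))
    return skill_overlap + min(candidate["years"], profile["years_cap"]) / 10

def shortlist(candidates, profile, top_n=3):
    ranked = sorted(candidates, key=lambda c: score(c, profile), reverse=True)
    return [c["name"] for c in ranked[:top_n]]

profile = {"skills": {"python", "sql", "spark"}, "years_cap": 10}
pool = [
    {"name": "Ana",  "skills": {"python", "sql"},          "years": 6},
    {"name": "Ben",  "skills": {"cobol"},                  "years": 20},
    {"name": "Cleo", "skills": {"python", "sql", "spark"}, "years": 4},
]
print(shortlist(pool, profile, top_n=2))
```

The manager sees only the final two names; everything a recruiter used to do by sifting resumes happens in the scoring step.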

    So empowerment of the business user and the usage of Analytics and BigData enable ‘lean’ recruiting. With no recruiters involved (they will become coaches for managers and applicants, product managers at software companies etc.) – certainly an aggressive vision, but ask any manager out there if they would like it. They are more likely to say yes than no… which means it will happen, sooner rather than later.

    Talent ‘Depth Chart’

    Sports teams, security and military teams have depth charts. A coach needs to know who can play the same position, should she decide to substitute the player on the field, should the player be injured, booked etc. Likewise there can’t be an ‘emergency’ meeting when the heavy machine gunner is incapacitated… so why has the Talent ‘Depth Chart’ not been enabled for positions in an enterprise?

    So let’s look at what it means first: a manager should see at any given time how well the current holder of a position is performing. Who else in the enterprise can do that job, and can do a better job than the incumbent? What talent (using the ‘lean’ recruiting above) could be hired from outside the enterprise? Or, if the position may not be around much longer and local labor laws make it challenging to shut down the position, what candidates from contingent worker sources could be hired? A contractor may even be the best fit for the position, regardless of employment status.

    Again the good news is that vendors are actively thinking about this. It is relatively easy to see how well an employee is performing; the manager will have an opinion on this anyway. But it is hard to do something about it. So finding coaching, learning and mentoring tools and options is the first step. Finding good fits in the enterprise is the next step. Easing the conversation with the manager of a professional one would like to ‘poach’ with ‘one click’ automation. Giving the phone numbers of the best 2-3 external candidate fits for a position. Same for the best contractors. Showing how well they can do the job: where would they rank on the Talent ‘Depth Chart’ for that position? All these functions can be built today, enabling the manager to assess incumbent and potential talent for a position.
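    A minimal, hypothetical sketch of such a Talent 'Depth Chart' as a data structure could look like the following; the names, sources and readiness scores are all made up for illustration:

```python
# Toy depth chart: for one position, rank the incumbent alongside
# internal, external and contingent candidates by a readiness score,
# so a manager sees the whole bench for that position at a glance.

def depth_chart(position, people):
    bench = [p for p in people if position in p["can_play"]]
    bench.sort(key=lambda p: p["readiness"], reverse=True)
    return [(p["name"], p["source"]) for p in bench]

people = [
    {"name": "Dee", "source": "incumbent",  "can_play": {"analyst"},        "readiness": 0.7},
    {"name": "Eli", "source": "internal",   "can_play": {"analyst", "dev"}, "readiness": 0.9},
    {"name": "Fay", "source": "contractor", "can_play": {"analyst"},        "readiness": 0.8},
]
print(depth_chart("analyst", people))
```

Note that the incumbent is just one entry on the bench, which is exactly the point: the chart shows where the current holder ranks against everyone who could play the position.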


    Transboarding

    I wrote about Transboarding for the first time in the summer of 2014 (more here) – it’s a word constructed by merging transfer, offboarding and onboarding. A lot of effort, time and resources are spent on Recruiting and Onboarding, and those are key HR functions, but the most common HR event, the transfer, is barely automated and supported.

    Consider the manager using a Talent ‘Depth Chart’ as described above. She identifies a suitable individual in another department. Let’s go for the easy scenario: the individual wants to transfer, his manager is supportive and, even better, has a need for the current incumbent. Basically a talent swap. Even such a smooth and simple case is substantial work across HR Core, Training and Talent Management systems. If we talk hourly workers, add the complexity of Workforce Management. What if the manager could do an ‘electronic handshake’, set the date, and the rest would be automated? The APIs for these functions are available. We are only waiting for vendors to ‘glue’ them together into very powerful automation in a flexible, business-user-empowering way.
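    As a hedged sketch of what that 'electronic handshake' could look like, the following invented function glues together stand-ins for the HR Core, Talent and Training systems a manual transfer touches today; no real vendor API is referenced and every name is hypothetical:

```python
# Toy transboarding orchestration: one call records the actions that
# would otherwise be three separate manual steps in three systems.

def transboard(employee, from_dept, to_dept, date, api_calls):
    # Each step stands in for one system a manual transfer would touch.
    steps = [
        ("hr_core",  f"move {employee} from {from_dept} to {to_dept} on {date}"),
        ("talent",   f"close goals for {employee} in {from_dept}"),
        ("training", f"assign {to_dept} onboarding plan to {employee}"),
    ]
    for system, action in steps:
        api_calls.append((system, action))   # stand-in for a real API call
    return len(steps)

log = []
n = transboard("Gil", "sales", "support", "2015-07-01", log)
print(n, log)
```

The design point is that the manager supplies only the employee, the target and the date; the sequencing across systems is the part worth automating.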

    Being able to Transboard people efficiently will be key for enterprises going forward. We already know enterprises need to become faster; if they can allocate people faster as needed, they will get faster, too. Not to mention the higher satisfaction of employees who can rotate through positions faster, without haggling with managers, without fear of upsetting the current manager and so on.

    Stay tuned for Part III.


    We attended the Teradata Influencer Summit last week in Del Mar, north of San Diego (Progress Report here), and fresh off the heels of this event, and well timed before the press release tsunami likely to happen around HadoopWorld, Teradata announces its support for Presto.

    So let’s dissect the press release (find it here) in the custom Constellation style:

    SAN DIEGO – June 8, 2015 – To make it easier for more users to extract insights from data lakes, Teradata (NYSE: TDC), the big data analytics and marketing applications company, today announced a multi-year commitment to contribute to Presto’s open source development and provide the industry’s first commercial support. Based on a three-phase roadmap, Teradata’s contributions will be 100 percent open source under the Apache® license and will advance Presto’s modern code base, proven scalability, iterative querying, and the ability to query multiple data repositories.

    MyPOV – Interesting and good move by Teradata, partnering with a promising open source initiative with good DNA (originally from Facebook) and proven practical usage (e.g. Airbnb, Facebook and more). As part of the Teradata UDA, the vendor used to execute its federated queries on the Hadoop side via Hive; with Presto it gets more generic and more powerful support for these queries. And good to see that Teradata will be a ‘good citizen’ of open source, becoming a (substantial) contributor to Presto. Also good to see it is a multi-year commitment, and the roadmap (we saw it under NDA) is rich, but realistic.

    Developed and used by Facebook, Presto is a powerful, next-generation, open source SQL query engine which supports big data analytics. There is a growing interest in Presto, as these corporations have adopted it: Airbnb, DropBox, Gree, Groupon, and Netflix.

    MyPOV – And here is the value: Presto is SQL-based, and SQL is a commonly known language for millions of business users and analysts. Bringing ‘SQL back to NoSQL’ is a giant quest, and Presto is one of the more successful initiatives on that strategy path. It also has great user DNA.

    Presto complements the Teradata® QueryGridTM and fits within the Teradata® Unified Data Architecture™ vision. Presto integrates with the Teradata® Unified Data Architecture™ by providing users the ability to originate queries directly from their Hadoop platform, while Teradata QueryGrid allows queries to be initiated from the Teradata Database and the Teradata Aster Database all through a common SQL protocol.

    MyPOV – Teradata lays out the strategy here, which is good for transparency. QueryGrid will be used on the Teradata and Aster side, throwing off queries to Presto as needed, no surprise. But Teradata will make its Presto offering open for direct Hadoop queries, a good move.

    Presto is agnostic and runs on multiple Hadoop distributions. In addition, Presto can reach out from a Hadoop platform to query Cassandra, relational databases, or proprietary data stores. This flexibility allows Presto to combine data from multiple sources, allowing for analytics across the entire organization through a single query. This cross-platform analytic capability allows Presto users to extract the maximum business value from data lakes of any size, from gigabytes to petabytes.

    MyPOV – The paragraph describes the value that Presto brings pretty well: from Presto a user can query pretty much anything. So Presto gives Teradata flexible data access again, not from the Teradata level but from the Presto / open source level. A very new approach for Teradata, but a good sign, as it shows that Teradata is walking the path of the times, a path with a clear rise of open source at its end.
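    To illustrate the cross-platform idea, here is a toy model of pluggable connectors behind a single query engine; this is not Presto's real API, and all names (ToyEngine, add_connector, join, the catalog names) are invented:

```python
# Toy single-query-over-many-sources model: each "connector" holds rows
# that in reality would live in HDFS, Cassandra or a relational store,
# and one join call combines them on a shared key.

class ToyEngine:
    def __init__(self):
        self.connectors = {}                 # catalog name -> rows

    def add_connector(self, catalog, rows):
        self.connectors[catalog] = rows

    def join(self, left, right, key):
        # One "query" spanning two catalogs, joined on a shared key.
        index = {r[key]: r for r in self.connectors[right]}
        return [{**row, **index[row[key]]}
                for row in self.connectors[left] if row[key] in index]

engine = ToyEngine()
engine.add_connector("hdfs", [{"user": 1, "clicks": 40}, {"user": 2, "clicks": 3}])
engine.add_connector("cassandra", [{"user": 1, "plan": "gold"}])
print(engine.join("hdfs", "cassandra", "user"))
```

The caller issues one logical query and never moves the underlying data between stores, which is the business value the press release is describing.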

    Teradata’s three-phase contribution to 100-percent open source code will advance Presto’s enterprise capabilities, which benefit customers.

    Phase 1 - Enhance essential features that simplify the adoption of Presto, including installation, support documentation, and basic monitoring. The Phase 1 capabilities are available today for download on GitHub.

    MyPOV – Kudos for laying out a roadmap, always something appreciated by the ecosystem. Apparently the installation of Presto was not trivial, so Teradata logically focused on that in Phase 1.

    Phase 2 - Integrate Presto with other key parts of the big data ecosystem, such as standard Hadoop distribution management tools, interoperability with YARN, and connectors that extend Presto’s capabilities beyond the Hadoop distributed file system (HDFS). These features will be available at the end of 2015.

    MyPOV – This will be the key release for the Teradata / Presto offering.

    Phase 3 – Enable ODBC (Open Database Connectivity) and JDBC (Java Database Connectivity API) to expand adoption within organizations and enhance integration with business intelligence tools. Enhance security by providing access based on job roles. These enhancements will be completed and available in 2016.

    MyPOV – And this will be to make the Teradata / Presto release very, very attractive to SQL savvy business users (and developers).

    In addition to its open source contributions, Teradata commercial support is now available from Think Big consulting. Think Big will offer its proven expertise in three areas to enable users to feel confident about putting Presto into production with assistance:

    Presto Jumpstart – In the cloud or onsite, Think Big will assist with piloting new functionality
    Presto Development – In the cloud or onsite, Think Big consultants will help customers design, build, and deploy a Presto solution
    Think Big Academy - Two-day workshops will help customers understand the best uses and criteria for architectural decisions.

    MyPOV – No surprise – Teradata will offer services here, and Think Big is the place where Teradata offers them. A good move.

    Overall MyPOV

    Open source is on the rise. In the last 12 months we have seen more and more open source uptake from Oracle, IBM and even outspoken past open source skeptics like Microsoft and SAP. Major ‘gifts’ have been made to open source – think of Pivotal’s recent move (see here). This all means that even skeptical enterprises have no choice but to implement, run and operate open source. The good news is that vendors see their opportunities on the services side, which will have to be paid for, but overall it looks (for now) as if open source is a significant relief to IT budgets. The less shared secret is – it saves time and reduces R&D budgets and efforts at ISVs, too.

    Closer to Teradata – a very smart move. Take a promising open source offering, free from competitor influence, and own the place. Contribute generously and lavishly to the roadmap, be a good open source citizen, and own it even more – all good moves.

    On the strategic side Presto is a huge hedge for Teradata – in the worst case scenario (Teradata ‘classic’ business slowly winding down), this is the first step of re-inventing Teradata on Hadoop. We will see if it comes to that, but a hedge is a hedge, even if not needed. The cross-database capabilities of Presto are very attractive.

    Teradata has made a good first move here; it will be interesting to see how the competition responds (find other open source initiatives – or even join Presto?). We will be watching.

    More on Teradata
    • Check out my colleague Doug Henschen's view on Presto and the recent analyst event - read here
    • Progress Report - Teradata is alive and kicking and shows some good 'paranoid' practices - read here

    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    This morning, Unit4, a European ERP vendor, announced the acquisition of Three Rivers Systems, a vendor specialized in helping the Higher Education sector run efficiently.

    So let's dissect the press release in our custom style (it can be found here):

    Unit4, a fast growing leader in enterprise applications for service organizations, today announced the acquisition of U.S.-based student management solution provider, Three Rivers Systems, Inc. The acquisition supports Unit4’s strategy to offer market leading industry-specific solutions to people-centric businesses. Education has always been a key pillar of the company’s strategy. Together with Three Rivers Systems, Unit4 serves more than 1,000 clients in Education globally.
    MyPOV – A focus for Unit4 is to grow further in North America, and buying local vendors who have the local market know-how, understand its requirements and also bring customers is a proven expansion strategy for enterprise software vendors.

    The Unit4 People Platform enables Three Rivers Systems’ student management system to seamlessly work together with Unit4 applications, creating the first next generation end-to-end ERP for Education. The system incorporates a powerful and flexible localization framework, which allows Unit4 to cater to local needs, such as Financial Aid in the U.S. or the Student Loan Council in the UK.
    MyPOV – Acquisitions are always a good showcase for platform flexibility of the acquiring vendor. It is good to see that Unit4 sees the People Platform being able to handle the Three Rivers Systems application. And good point to raise the local market expertise of Three Rivers Systems.

    “The Education market currently lacks a modern and complete business application platform ready to be deployed globally. This is exactly what Unit4 and Three Rivers Systems will jointly deliver,” says Jose Duarte, Unit4 CEO. “This acquisition has created a global market leader for Education ERP solutions, and significantly strengthens Unit4’s market position in North America. By combining technologies and resources, we are the first to deliver the full suite of next-generation ERP for the global Education community.”
    MyPOV – The usual quote, but very true on the lack of a modern enterprise platform for higher education, which faces a piecemeal IT infrastructure. With higher education institutions under pressure to reduce costs, IT needs to find ways to run more efficiently, while supporting 21st century best practices for students, parents and employees alike.

    With three decades’ experience developing enterprise solutions exclusively for colleges and universities, Three Rivers Systems is highly regarded in the U.S. for its best in class student management solutions. The acquisition of Three Rivers Systems’ Student Management technology and tight integration with Unit4’s broad and feature rich Finance, Human Relations and Research Management applications will result in the industry’s first and most advanced and complete global ERP for Education. New and updated features of the combined offering include: Human Resources & Payroll, Finance Management, Research Management, a complete Student Management system covering Student Acquisition, Course Management, Retention, Degree Auditing and much more.
    MyPOV – Good description of the upcoming planned scope.

    Whether adopting new process models or integrating with a newly introduced system, Unit4 is able to easily adapt to the ever changing business requirements and needs of Education institutions. The platform helps grow enrolments, improve student success, and improve overall institutional effectiveness through e.g. the incorporation of built-in data analysis. A truly innovative and clean interface lets users benefit from an easily responsive system with optional touch based capabilities when using a tablet.
    MyPOV – Good description of the aspiration of the future system and some of the capabilities that the system has.

    “Unit4 and Three Rivers Systems shared focus on product innovation plus people and student centric applications perfectly cover the front and back office of today’s Education institutions,” says Amir Tajkarimi, President & Founder of Three Rivers Systems. “Unit4 will help to expedite innovation cycles around Three Rivers Systems’ advanced student management technology, while also bringing it to the global market. The acquisition will result in strengthened global sales and support teams as well as increased bandwidth for ongoing research and development around our combined product suite.”
    MyPOV – No need to comment on the usual quote.

    Three Rivers Systems aligns tightly with Unit4’s Cloud Your Way deployment model, so that the new Campus suite will support higher education institutions’ preferred ERP architecture, whether in the cloud, on premise, or hybrid. Unit4 and Three Rivers Systems’ solutions will be integrated via the Unit4 People Platform with Social, Mobile and Analytics driven capabilities.[…]
    MyPOV – Good to mention; guess it was too early to offer a roadmap, but that is the next logical question.

    Unit4’s ERP is the only system designed from the ground up for people-centric industries. It empowers people to better engage and create impact, while automating low-value repetitive tasks.

    Overall MyPOV

    A good move by Unit4, which needs to find ways to grow fast in North America. Though Higher Education is only the 3rd most important vertical for the vendor, it is currently a good opportunity in North America to capitalize on. Higher Education institutions are in an innovation cycle, after almost decade(s) in the doldrums. Credit goes to Workday for stirring up the industry with its new Student offering. Good momentum to exploit, especially with a smaller scale customer target.

    On the concern side it is one more thing for Unit4 to do. It not only needs to push Higher Education now, but also the other verticals it is focusing on. On the product side Unit4 needs to integrate Three Rivers Systems – and we are not concerned that the new People Platform is not up to the task – but it is work that needs to be done quickly.

    Overall a good acquisition for Unit4, I am sure we will learn more about it at the vendor’s analyst day this week.


    We had the opportunity to attend Globoforce’s Workhuman event, currently being held in Orlando. A year ago this event was a small, 100-attendee affair; this year it has grown to about 500 attendees. Always good to see such growth, largely through positioning the event not as a user conference, but as a work / life / engagement conference, sponsored by Globoforce.

    So here are my Top 3 takeaways of the event:

    Engagement matters – No surprise, engagement was front and center at this conference, and that’s what Globoforce is known for and very good at. The perspective changed a little bit though – the view now is more about working more ‘human’, bringing back the ‘humanity’ to the workplace. Humans do well with feedback, especially when it is positive, and all of that leads to more engaged employees. It was interesting to learn that video feedback works even better – something seems to kick in inside our brains when we see positive feedback being given to us.

    And more engaged employees don’t leave, don’t call in sick, deliver more and are overall more productive. So engagement is something that all employers should strive for. Why they aren’t remains one of the mysteries of the 21st century – and a challenge for vendors like Globoforce. Performance Management at large remains broken in North America and Europe, and we know that more continuous feedback is key. CEO Mosley went so far as to call it ‘crowdsourcing’ the feedback process. With an aging and more and more expensive workforce, we think time plays into the hands of Globoforce (and similar vendors), as employers can afford a disengaged workforce as little as they can afford not to understand where and who their talented employees are.

    CEO Mosley welcomes attendees

    Partnership with IBM – Globoforce and IBM unveiled a partnership at the event. Basically Globoforce will become a data source for IBM’s cognitive computing platform, Watson, where, combined with Kenexa and other data, it will help create more powerful analytics for HR professionals (using the IBM Kenexa Talent Insights product, powered by IBM Watson Analytics). As is common these days, the integration will be vendor supported, with Globoforce maintaining the interface for its data towards IBM, which is a sensible approach. Both vendors will explore common analytical offerings, but I guess it was too early for more specifics here. A good move for both vendors, as Globoforce gets access to a cognitive computing platform built by an R&D team funded at a multiple of Globoforce’s revenue (no numbers disclosed – guesstimate here) and IBM gets another data source to make Watson predictions more powerful. On the IBM side the move is another step in IBM’s strategy to acquire data sources for better insights – on a macro level see e.g. the recent partnerships with Twitter and weather providers. The press release on this announcement can be found here.

    Happy Workers work Harder, from Mosley's presentation

    Rich roadmap – Closer to home for Globoforce, the vendor plans to improve reporting and analysis capabilities. Not surprising, as reports can show whether the social engagement and feedback solution that Globoforce offers really works. As is typical for SaaS vendors, releases come out quarterly and the roadmap is rather fluid and dynamic, so it will be interesting to see what Globoforce does on the overall functionality side as well as what will come out of the partnership with IBM in regard to ‘true’ analytics (those that take an action and / or make a recommendation and in 99% of cases don’t show colorful pictures).

    Visualization of recognition in Globoforce product


    A good event for Globoforce, with above average keynote speakers for an event of this size, and a dynamic mix of customers and prospects who are all energized to increase employee engagement through recognition. The setup for Globoforce is favorable, as its product needs only minimal setup - e.g. users - and is good to go - so perfect for a cloud-based, next generation application.

    On the concern side we found a few inconsistencies in the Globoforce user interface, something the vendor can address with some easy and quick housekeeping releases. The basic functionality – reward / recognize – is straightforward and easy to use.

    It will be interesting to see what happens to the overall employee recognition space in the next 4-6 quarters – we will be watching. In the meantime congrats to Globoforce on a great event.

    More interesting blog posts:
    • Musings - Speed matters for HR - how to accelerate - Part II read here - Part I read here
    • Musings - Is TransBoarding the future of People Management - read here
    • Musings - What are 'true' analytics - a manifesto - read here
    • Musings - How Technology Innovation fuels Recruiting and disrupts the Laggards - read here
    • Musings - What is the future of recruiting? Read here
    • HRTech 2014 takeaways - read here.
    • Why all the attention to recruiting? Read here.
    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    This morning Google and JDA software surprised the markets with a press release that announced that Google Cloud Platform (GCP) is becoming JDA’s public cloud infrastructure. This is obviously a first for JDA and Google, not for the market (see below).

    So let’s dissect the news release in our custom style (it can be found here):

    Scottsdale, Ariz. – June 10, 2015 –JDA Software Group, Inc. today announced an innovative new collaboration with Google aimed at leveraging the core strengths of both companies to deliver JDA’s next generation cloud-based omni-channel and supply chain solutions via Google Cloud Platform, a powerful public cloud offering. Through this collaboration, Google will provide a uniquely scalable and flexible technology platform via the cloud to support JDA’s future application development and delivery.

    MyPOV – Good move by JDA, as choosing a public cloud provider saves substantial CAPEX that would have gone into data center build-out. Now JDA can invest those savings into product. For Google it means more load for GCP, more enterprise exposure and likely the start of a number of ISVs signing up for Google as cloud provider. Load is essential for all IaaS vendors, as it ensures the economies of scale they need to procure attractive supply chain prices and rates. The untapped potential of on premises enterprise software is one of the largest growth opportunities for public cloud vendors overall. Take JDA with 4000 customers – ultimately they will all have to go to public cloud solutions. That is a lot of server capacity. And finally a lot of customers to cross-sell Google Apps and Google for Work to.

    “Google Cloud Platform offers the unparalleled speed, performance, scalability and reliability we need to launch truly differentiated solutions. After thoroughly evaluating potential Platform as a Service (PaaS) providers, JDA chose to work with Google due to its unsurpassed technology platform, investments and deep culture of innovation,” said Serge Massicotte, executive vice president and chief technology officer at JDA Software.

    MyPOV – There is little doubt GCP outperforms other clouds on the performance side of the overall infrastructure. My recommended simple test is to monitor the speed of email provisioning when travelling internationally. I do that a lot and still have to find the place on earth where my Gmail account does not beat e.g. my Office account. And it is no surprise – the core business model of Google, advertisement, needs very, very fast servers and networks. More surprising is that the JDA statement refers to Google as a PaaS, likely meaning Google App Engine (GAE), which has been a less popular choice by enterprise software vendors. We need to learn more about JDA’s plans and use case to understand this better.

    This collaboration, which will significantly accelerate the development of JDA’s next generation cloud solutions, is JDA’s most recent initiative aimed at delivering innovative products and services for its customers. With an unmatched R&D investment in supply chain and omni-channel solutions, the company recently formed JDA Labs– a dedicated research and development group committed to delivering patents, best practices and entirely new products to the market. Google Cloud Platform initiatives will be developed out of the JDA Labs in Montreal. JDA’s work with Google also complements JDA’s newly announced FLEX platform strategy, which easily connects JDA’s existing cloud-based solutions and on-premise solutions with next generation solutions built on Google Cloud Platform.

    MyPOV – Two good moves by JDA. Forming a lab for more innovative work has been a proven approach in enterprise software, see e.g. also the SAP Lab network. The FLEX platform is an interesting approach bringing together more traditional JDA platforms with its next generation cloud strategy.

    “With thousands of successful customers — including 21 customers named as part of the Gartner Supply Chain Top 25 for 2015— JDA has clearly established its leadership in delivering world-class retail and supply chain solutions,” said Massicotte. “To maintain and expand that leadership, JDA is focused on developing new innovative products and services that will truly change the supply chain landscape. By working with Google — an established innovation leader — JDA will concentrate on working with our customers to co-develop these groundbreaking solutions with Google Cloud Platform, providing an unmatched foundation. It’s a huge win-win for JDA customers, who will benefit from best-in-class solutions, delivered rapidly, from two proven market leaders working together.”

    "We're thrilled that JDA has chosen to work with Google Cloud Platform to develop their next generation of products and services that will change the supply chain landscape," said Dan Powers, director, Google Cloud Platform. "The supply chain and omni-channel industry is ripe to benefit from the innovation, scale and flexibility of our public cloud offering, and by betting on Google, JDA can now focus on creating high impact business solutions while quickly adapting to meet customer needs." […]

    MyPOV – The usual quotes – no commentary needed.

    This month, JDA will be part of the keynote at a series of Google Cloud Platform Next events worldwide that highlights our work together. JDA executives will be featured speakers at Next events in New York on June 12, San Francisco on June 16, London on June 23 and Amsterdam on June 25. Learn more about the Google Next event series here.

    MyPOV – Well, good to be able to promote the new offering at Google events. I would not be surprised to hear a repeat of ‘Infor – who?’ (like at the AWS Cloud event in March 2014 in San Francisco) in the form of ‘JDA – who?’ at these events – but that is all part of becoming known as a vendor built on the public cloud.

    Overall MyPOV

    It is a Win / Win / Win for the partners and JDA customers. JDA customers should see a better return on R&D given the choice JDA has made with GCP. Let’s not underestimate the TCO aspect in this partnership, too – as Google has recently lowered prices (again – read here) – and many JDA customers turn the penny twice before they spend it. JDA saves CAPEX that it can put into its product organization and roadmap. Google gets load that helps it scale better.

    On the concern side, JDA is the first enterprise vendor to opt for Google. Certainly JDA has done good due diligence and Google is motivated to make it a success, but being first has risks – but also rewards when done right and successfully. And it is clear for JDA customers that the bulk of R&D going forward will be on public cloud platforms. Like it or not, customers should prepare and accommodate for that.

    But in the meantime congrats to both vendors for a synergistic partnership, very likely many more to follow.

    More on Public Cloud Firsts:
    • Infor runs on Amazon AWS (read here)
    • SAP on IBM Cloud (read here)
    • Lumesse on Salesforce (read here)  and
    • NetSuite on Azure (read here). 

    More about Google:

    • Event Report - Google I/O - Google wants developers to first & foremost build more Android apps - read here
    • First Take - Google I/O Day #1 Keynote - it is all about Android - read here
    • News Analysis - Google does it again (lower prices for Google Cloud Platform), enterprises take notice - read here
    • News Analysis - Google I/O Takeaways - Value Propositions for the enterprise - read here
    • Google gets serious about the cloud and it is different - read here
    • A tale of two clouds - Google and HP - read here
    • Why Google acquired Talaria - efficiency matters - read here

    Find more coverage on the Constellation Research website
    here and checkout my magazine on Flipboard.


    We had the opportunity to attend Unit4’s North America analyst summit (the vendor had the same event for Europe based analysts last week) in Boston this week. The event was held on top of the 60 State Street building with gorgeous views of Boston. Even the local colleagues could not stop taking pictures. Great location for similar events.  

    We learnt many things – here are my Top 3 takeaways:

    A compelling Strategy – In the general overview on the company, CEO Duarte admitted that Unit4 has gone quiet for most of the last two years and used the time to reposition the company for growth on both the product side and the go to market side. Duarte ran us through the 4 pillars of the Unit4 strategy – vertical solutions for services enterprises, applications for people, an agile architecture and native, authentic cloud solutions. All good strategy pillars, and it was good to see that Unit4 could show the first deliverables on the product side at the end of the day. On the go to market side Unit4 wants to grow its sales force by 20%, with a focus on North America, the UK and France / Germany. Unit4 intends to grow the partner channel; with unusual openness the vendor shared that right now the ratio between Unit4 doing implementations in-house vs. partners implementing is close to 1:1 – the intention is to freeze Unit4’s internal portion of services and grow the partner side. Duarte shared the ambition to create a $1B partner ecosystem in the next years. Certainly a good strategy to get the attention of service partners.
    A great overview slide on Unit4

    A new Architecture – Unit4 shared its new product architecture, articulated across 4 layers. No surprise, the vendor wants to provide an attractive, consumer grade UI on top of competitive vertical capabilities. So far pretty common objectives across enterprise software players. What sets Unit4 apart are the next two layers, where the vendor stresses a smart context layer and an elastic foundation. With dynamic context Unit4 can create richer and more powerful user interactions, enriched by ‘true’ analytics. The elastic foundation basically asks for a cloud deployment – an elastic cloud infrastructure that can run the application but also provide the compute power to create the context and analytics. Unit4 has some concrete plans here – stay tuned, as the vendor will unveil them soon. Overall an attractive architecture; Unit4 showed some early demos in the areas of Professional Services Automation (PSA), sentiment analysis and the likelihood of invoices being paid. All three are encouraging first examples, but the vendor will have to cover significantly more ground to get more of them built and achieve release grade later in the year.
    The Unit4 People Platform

    Vertical Focus backed up – Earlier this week Unit4 announced the acquisition of Three Rivers Systems, a vendor specialized in the Higher Education market with its CAMS product (I covered the acquisition in a News Analysis blog post that can be found here). So a good proof point that Unit4 is serious both about vertical focus and growth in North America. The CEO of Three Rivers Systems, Tajkarimi, was on hand to give us an update on the vendor and the products. Three Rivers has over 50 employees and over 200 customers in Higher Ed, based all over the world and with student bodies of all sizes. Its CAMS product is being rebuilt for the cloud, in a very encouraging, declarative way that is open for self-service setup. Graphical editors allow users to design system behavior. No detailed delivery timelines were shared (yet), but the demo showed an attractive system. With the acquisition Unit4 leadership thinks it is well ahead of Workday for Higher Education – and the vendor probably is for now (but let’s see what Workday unveils later this year). Overall a strong vertical commitment is certainly welcome and, executed right, will help Unit4 differentiate in the markets where it operates.
    The Unit4 view on the Higher Ed Market


    A very good analyst meeting for Unit4, which had an impressive analyst turnout, especially when one considers it is busy conference season. More importantly the vendor shows some interesting approaches for the product, with a dynamic context layer, the inclusion of ‘true’ analytics and cloud deployments. Equally the plans on the go to market side are plausible and realistic, something never to overlook or underestimate.

    On the concern side Unit4 has the luxury that it ‘only’ needs to execute. Cloud bookings are growing 70%+, and the vendor needs to maintain that momentum. Equally it needs to deliver on the product side – no roadmap was offered, so the vendor will need to issue one soon and then deliver and execute on it. Getting all that done, carefully orchestrated internally, with partners and customers, is no easy task. But Unit4 has an experienced management team that should be able to pull this off.

    Overall Unit4 was able to put down a differentiated and innovative vision for a next generation ERP application with a strong vertical focus on services industries. The markets are vast and enterprises are looking for new and modern enterprise software. So a promising future for Unit4, its customers, prospects and partners – but the vendor needs to deliver. We will be watching.

    More on Unit4
    • News Analysis - Unit4 acquires Three Rivers Systems - read here

    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard

    And a collection of key tweets from the event in a Storify collection:


    This morning SAP announced the release of its latest version of SAP HANA, its in memory database, with SPS10. 

    So let’s dissect the press release in our custom style (it can be found here):

    NICE, France — June 16, 2015 — SAP SE (NYSE: SAP) today announced the release of service pack 10 (SPS10) for the SAP HANA® platform, helping customers successfully extend all the key functionalities of their core business to the edge of the network where remote business transactions and events actually occur. The latest release delivers new capabilities that help customers connect with the Internet of Things (IoT) at enterprise scale, manage Big Data more effectively, further extend high availability of data across the enterprise and develop new applications. The announcement was made at SAPinsider HANA 2015, being held June 16-18 in Nice, France.
    MyPOV – Good to see SAP follow up on its plans from Sapphire (my event report here) and deliver on the key new functionality – which is all about the support of Hadoop for Big Data and IoT scenarios. On top of that SAP has put in a lot of new functionality beyond good housekeeping – but let’s read on.
    “SAP HANA gives customers one integrated platform for transactional and analytic workloads,” said Quentin Clark, chief technology officer, SAP. “The new capabilities of SAP HANA ensure data center readiness, synchronization of data to any remote system, extend high availability and disaster recovery of enterprise data and perform advanced analytics. We are readying our customers for the inevitable digitization of our entire economy.”
    MyPOV – Good quote from CTO Clark, correctly focusing on necessary (and some would say overdue) good database capabilities like data center readiness, remote synchronization and HA / DR. We know a number of enterprises have been waiting for these capabilities to start or extend their investments into HANA – so it’s good to see SAP delivering on these.
    Connecting to the Internet of Things at Enterprise Scale
    The new remote data synchronization feature of SAP HANA enables organizations to synchronize data between the enterprise and remote locations at the edge of the network. Developers can now build IoT and data-intensive mobile apps that leverage SAP HANA remote data synchronization between the enterprise and remote locations via the SAP® SQL Anywhere® suite – a leading enterprise-ready, embeddable database technology with more than 12 million licenses. Now, enterprise data can be securely captured, accessed and fed back into SAP HANA from remote workplaces such as retail stores and restaurants. In addition, customers can collect and analyze IoT data for performing critical tasks at distant locations including predictive maintenance on ships, pumping stations and in mines with low bandwidth or intermittent connections or even while offline.

    MyPOV – SAP keeps harping on the IoT use case, which finally, with Hadoop support, becomes a real strategy for HANA customers. And it is good to see that SAP is leveraging assets from the former Sybase – the venerable but proven SQL Anywhere. SAP being able to use the SQL Anywhere offline and remote capabilities, combined with the synchronization functions, is a good move. More things than we think don’t have continuous, good, reliable wireless (or even network) access.
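    The offline-first pattern the release describes is worth spelling out: capture records locally at the edge (ship, mine, store), queue them, and replicate upstream only when a link is available. Here is a minimal, illustrative Python sketch of that pattern – it is not SQL Anywhere's actual API, and all names and record fields are hypothetical.

```python
# Illustrative sketch of edge capture with deferred synchronization.
# Not SQL Anywhere / SAP HANA syntax; names are placeholders.

class EdgeStore:
    """Local store at a remote site with a queue of unsynced rows."""

    def __init__(self):
        self.rows = []     # all locally captured records
        self.pending = []  # records not yet replicated upstream

    def capture(self, record: dict) -> None:
        """Record data locally, regardless of connectivity."""
        self.rows.append(record)
        self.pending.append(record)

    def synchronize(self, upstream: list, online: bool) -> int:
        """Flush pending rows upstream; a dropped link just defers them."""
        if not online:
            return 0
        sent = len(self.pending)
        upstream.extend(self.pending)
        self.pending.clear()
        return sent

hana_side = []  # stand-in for the central (e.g. SAP HANA) tables
edge = EdgeStore()
edge.capture({"pump": 7, "vibration": 0.4})
edge.capture({"pump": 7, "vibration": 0.9})

edge.synchronize(hana_side, online=False)  # offline: nothing lost, nothing sent
edge.synchronize(hana_side, online=True)   # link returns: both rows flush
print(len(hana_side))  # -> 2
```

    The design point is that intermittent connectivity changes *when* data arrives centrally, not *whether* it does – which is exactly what predictive maintenance at distant locations needs.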

    Streamlining Data Access and Management of Big Data
    Businesses can continue to harness Big Data using expanded smart data integration capabilities of SAP HANA with the latest Hadoop distributions from Cloudera and Hortonworks. Additional enhancements for SAP HANA include faster data transfer with Spark SQL and a single user interface (UI) for SAP HANA and Hadoop cluster administration using Apache Ambari. IT organizations can also take advantage of new rules-based data movement among multiple storage tiers based on business requirements. For example, organizations can set up rules to keep one year of data in memory and set a rule to move older data to disk storage or Hadoop. Finally, customers can have greater confidence in the data they are collecting with new smart data quality capabilities for SAP HANA to cleanse data and merge duplicates by using an easy-to-use, Web-based development workbench.
    MyPOV – A very good (and overdue) move by SAP, allowing the usual separation of hot, less hot and cold data. A year ago Hadoop was almost a bad word in SAP circles; it is remarkable and commendable to see SAP turning the corner here and supporting the most prominent enabling database technology for next generation applications. Spark was already mentioned at Sapphire and will be key for SAP to make the future offering fly. More to come.
    It is also key for SAP to support both Cloudera and Hortonworks on the distribution side; SAP cannot play favorites here (and should not either). It is always good to see SAP use common open source tools like Ambari. And good to see more data cleansing and quality options, something that is always good to have and previously required 3rd party solutions.
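    The rules-based data movement across tiers can be illustrated with a small sketch. This is hypothetical Python, not SAP HANA syntax; the tier names and age thresholds are my own assumptions, mirroring the "one year in memory, older data to disk or Hadoop" example above.

```python
from datetime import date

# Illustrative tiering rules: ages are in days; names are not SAP syntax.
TIER_RULES = [
    ("in-memory", 365),      # hot: keep up to one year in memory
    ("disk", 3 * 365),       # warm: up to three years on disk
    ("hadoop", None),        # cold: everything older goes to Hadoop
]

def assign_tier(record_date, today):
    """Return the storage tier a record belongs to, given its age."""
    age = (today - record_date).days
    for tier, max_age in TIER_RULES:
        if max_age is None or age <= max_age:
            return tier
```

    In a real deployment such rules would drive background movement jobs rather than per-record lookups, but the classification logic is the same.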

    Extending High Availability and Scalability Across the Enterprise
    SAP HANA delivers new high availability and disaster recovery capabilities to help ensure data center readiness and support for always-on, mission-critical applications. Features such as 1-to-n asynchronous replication, auto-host failover for dynamic tiering and incremental backups help reduce system downtime and facilitate true business continuity across the enterprise. In addition, companies can leverage the NUMA (non-uniform memory access) -aware architecture of SAP HANA to support large scale systems with more than 12TB of memory to rapidly process large data sets and improve overall business performance. Enhancements in workload management help further improve mixed workload performance to optimize resource more effectively.

    MyPOV – Good to see SAP extending and catching up on HA and DR capabilities – something customers have been focusing on for quite some time. Dynamic tiering is a key capability to optimize cost and performance, and something customers equally welcome. Finally, NUMA support is key to keep HANA on the cutting edge of both processor and in-memory capability, an important move by SAP. Though SAP has not been explicit on this, one can almost read the learnings between the lines of SAP running more and more HANA applications for its customers. With that come important lessons learned and key additions to the product. [...]

    Innovating Through Advanced Analytics
    With the expanded data processing capabilities in SAP HANA SPS10, businesses can accelerate the development of powerful applications with advanced analytics. SAP HANA text mining now extends to the SQL syntax, making it easier for developers to build next-generation applications. The new spatial processing enhancements of SAP HANA include support for multidimensional objects and spatial expressions in SAP HANA models or SQLScript. As a result, developers can incorporate engaging visualizations in their business applications.
    MyPOV – To a certain point SAP HANA comes full circle, as it started with TREX – an in-memory search solution that SAP developed a long time ago. Allowing SQL queries for text mining is like the text search of TREX making it fully into the database world, accessible with the universal database language SQL. Moreover, it is good to see SAP continuing to invest in spatial support, which is critical for building next generation applications that model the real world. Humans and things have locations, and knowing and understanding them is crucial for a modern database.

    Strong Momentum on SAP HANA
    The number of customers transforming their business with SAP HANA is dramatically increasing. SAP HANA currently has more than 6,400 customers, almost doubling from only one year ago. SAP HANA Cloud Platform has quickly built momentum with approximately 1,400 customers. SAP Business Suite 4 SAP HANA (SAP S/4HANA) has driven tremendous interest out of the gate with more than 370 customers in 2015 alone. The SAP Business Warehouse application on SAP HANA continues to have strong traction with over 1,900 customers. Adoption of SAP HANA by startups has soared, with more than 1,900 leveraging the SAP HANA platform today. In addition, there are more than 815,000 active users of SAP HANA.
    MyPOV – Kudos to SAP for sharing the most complete set of numbers yet on SAP HANA adoption. These are impressive numbers, but it looks like progress may have slowed a bit compared with the very high growth rates of the early HANA years. But all high growth needs to be converted into sustainable growth based on real use cases, and here SAP has made good progress. Better 100 live customers and references than 500 interested parties that don’t do much (those numbers are completely my own manufacture, to illustrate the point).

    Overall MyPOV

    At the pre-kindergarten age of 4.5 years, HANA is growing up fast. It is good to see that SAP keeps supporting and extending older ‘growth spurts’ like spatial and not abandoning them. And HANA is helping SAP internally to host applications built on HANA, and that daily work has an influence on the product roadmap and releases. Using your own HANA has advantages. (Ok the analogy has to stop somewhere).

    Extending the HA / DR capabilities is key housekeeping that to a certain point was even overdue. The exciting new capabilities are around Hadoop support, enabling BigData and IoT use cases.

    On the concern side, SAP needs to keep expanding SAP HANA capabilities and drive more customer adoption. It has made the right moves on the product side; now it needs to get early customer adoption to showcase the new capabilities – but that’s a good and natural problem to have. And as SAP has not abandoned some early development strands (e.g. spatial) but keeps supporting and extending them, it is clear that SAP has the R&D know-how and budget to keep developing and investing in SAP HANA.

    Overall congratulations to SAP for a new and rich SAP HANA release with SPS 10 – it’s time for the 4.5 year old HANA to go to preschool soon (for the non US based reader that is the year before 1st grade in the US).

    And more on overall SAP strategy and products:

    • Event Report - SAP Sapphire - Top 3 Positives and Concerns - read here
    • First Take - Bernd Leukert and Steve Singh Day #2 Keynote - read here
    • News Analysis - SAP and IBM join forces ... read here
    • First Take - SAP Sapphire Bill McDermott Day #1 Keynote - read here
    • In Depth - S/4HANA qualities as presented by Plattner - play for play - read here
    • First Take - SAP Cloud for Planning - the next spreadsheet killer is off to a good start - read here
    • Progress Report - SAP HCM makes progress and consolidates - a lot of moving parts - read here
    • First Take - SAP launches S/4HANA - The good, the challenge and the concern - read here
    • First Take - SAP's IoT strategy becomes clearer - read here
    • SAP appoints a CTO - some musings - read here
    • Event Report - SAP's SAPtd - (Finally) more talk on PaaS, good progress and aligning with IBM and Oracle - read here
    • News Analysis - SAP and IBM partner for cloud success - good news - read here
    • Market Move - SAP strikes again - this time it is Concur and the spend into spend management - read here
    • Event Report - SAP SuccessFactors picks up speed - but there remains work to be done - read here
    • First Take - SAP SuccessFactors SuccessConnect - Top 3 Takeaways Day 1 Keynote - read here.
    • Event Report - Sapphire - SAP finds its (unique) path to cloud - read here
    • What I would like SAP to address this Sapphire - read here
    • News Analysis - SAP becomes more about applications - again - read here
    • Market Move - SAP acquires Fieldglass - off to the contingent workforce - early move or reaction? Read here.
    • SAP's startup program keeps rolling – read here.
    • Why SAP acquired KXEN? Getting serious about Analytics – read here.
    • SAP streamlines organization further – the Danes are leaving – read here.
    • Reading between the lines… SAP Q2 Earnings – cloudy with potential structural changes – read here.
    • SAP wants to be a technology company, really – read here
    • Why SAP acquired hybris software – read here.
    • SAP gets serious about the cloud – organizationally – read here.
    • Taking stock – what SAP answered and what it didn’t answer this Sapphire [2013] – read here.
    • Act III & Final Day – A tale of two conferences – Sapphire & SuiteWorld13 – read here.
    • The middle day – 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
    • A tale of 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
    • What I would like SAP to address this Sapphire – read here.
    • Why 3rd party maintenance is key to SAP’s and Oracle’s success – read here.
    • Why SAP acquired Camillion – read here.
    • Why SAP acquired SmartOps – read here.
    • Next in your mall – SAP and Oracle? Read here.

    And more about SAP technology:
    • HANA Cloud Platform - Revisited - Improvements ahead and turning into a real PaaS - read here
    • News Analysis - SAP commits to CloudFoundry and OpenSource - key steps - but what is the direction? - Read here.
    • News Analysis - SAP moves Ariba Spend Visibility to HANA - Interesting first step in a long journey - read here
    • Launch Report - When BW 7.4 meets HANA it is like 2 + 2 = 5 - but is 5 enough - read here
    • Event Report - BI 2014 and HANA 2014 takeaways - it is all about HANA and Lumira - but is that enough? Read here.
    • News Analysis – SAP slices and dices into more Cloud, and of course more HANA – read here.
    • SAP gets serious about open source and courts developers – about time – read here.
    • My top 3 takeaways from the SAP TechEd keynote – read here.
    • SAP discovers elasticity for HANA – kind of – read here.
    • Can HANA Cloud be elastic? Tough – read here.
    • SAP’s Cloud plans get more cloudy – read here.
    • HANA Enterprise Cloud helps SAP discover the cloud (benefits) – read here
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard


    Earlier today Unit4 announced that it has picked Azure as its platform to build its next generation ERP applications.

    Let’s digest the press release in our custom Constellation style (it can be found here):

    Utrecht, Netherlands, June 16, 2015 – Unit4, a fast growing leader in enterprise applications for service organizations, today announced a strategic collaboration with Microsoft that will speed to market the creation of self-driving business applications and ERP for people-centric organizations.
    MyPOV – The enterprise software ISVs are picking their clouds – last week e.g. JDA picked the Google Cloud platform; this week it is Unit4 with Microsoft Azure. A good move for both – but more on that later.

    Unit4 will use the smart technology in Azure’s PaaS platform components and Microsoft Office solutions including predictive analytics, machine learning, event stream analysis and complex event processing. Combined with Unit4’s People Platform, the collaboration boosts speed of innovation and will see customers benefiting from a new approach to enterprise computing.
    MyPOV – Unit4 is leaning on Microsoft in areas where Microsoft is likely putting more R&D dollars than Unit4 has overall – so a smart move to leverage that. With Machine Learning, Office and PaaS capabilities, Microsoft has created an attractive set of functions that should attract more enterprise software vendors than just Unit4. But Unit4 gets the prize for first mover.

    Unit4 with Microsoft – making self-driving ERP a reality
    To deliver on the promise of self-driving ERP, Unit4 and Microsoft will establish a development team to ensure that innovations in Azure and Office are used as quickly as possible within the Unit4 People Platform. Contextual information will be derived by combining business data within Unit4 applications with relevant information from Office 365, Office Delve, email, calendar, Yammer and more. This will enable unprecedented business value. Professional services firms will be able to combine an analysis of historical data with predictive analytics to gain valuable insight on which projects to bid for. Public sector organizations will improve fraud detection on bill payments with advanced pattern recognition and machine learning. Not-for-profits will have the ability to match campaigns to donation patterns and target donors more effectively.

    MyPOV – Unit4 presented its next generation application platform last week at its North American analyst summit (my take here). It’s a modern and attractive architecture, and it is good to have Microsoft as a technology partner for it. But know-how transfer is always challenging, and with all software the devil is in the details, so it is good to see both vendors forming a team. The Office 365 and Office Delve integration can be a substantial sales channel for Unit4, assuming it is done right.

    “Microsoft and Unit4 share a long history of increasing productivity for enterprises and this collaboration will accelerate the innovation necessary to make self-driving ERP a reality,” says Jose Duarte, Unit4 CEO. “Like a self-driving car, self-driving ERP takes care of the tasks that are better served by technology, leaving people to focus on the exceptions that need human intervention. Unit4 and Microsoft’s combined know-how and technology will set a new industry standard for business applications for people-centric industries.”
    MyPOV – Good quote by CEO Duarte, clarifying more the vision of self-driving ERP.

    To achieve this vision, ERP systems require access to complete and high-quality data. Traditional ERP provides users with non-intuitive empty forms, asking them to enter all the required data, leading to non-intuitive data entry, errors and frustration. Self-driving ERP dramatically simplifies data collection by utilizing key technologies such as predictive analytics to provide meaningful in-context information to users. Such information enables intuitive data entry based on pre-populated forms and in-context yes-no validation questions. It results in great user experience all the way from desktop to mobile to wearables.
    MyPOV – Unit4 shares a little of its magic around self-driving ERP – which has a heavy context component. We know context is very powerful, but ERP solutions have relied on their human users for the longest time to provide that context. If Unit4 can capture and insert context into ERP applications, it will deliver very powerful, eye-opening ERP capabilities.
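    To make the idea of pre-populated forms driven by historical data concrete, here is a minimal hypothetical sketch (plain Python, not Unit4 code): each form field defaults to the most frequent value from past submissions, leaving the user only to confirm or correct.

```python
from collections import Counter

# Illustrative only - not Unit4's implementation. Defaults each field of a
# new form to the most common value seen in historical submissions.

def suggest_defaults(history):
    """history: list of dicts (past form submissions) -> dict of defaults."""
    defaults = {}
    fields = {f for entry in history for f in entry}
    for field in fields:
        values = [entry[field] for entry in history if field in entry]
        if values:
            # Most frequent historical value becomes the pre-filled default.
            defaults[field] = Counter(values).most_common(1)[0][0]
    return defaults
```

    A real self-driving ERP system would of course use far richer predictive models and context signals (calendar, email, projects), but the user-facing effect is the same: confirm a pre-filled form instead of typing into an empty one.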

    Working together in the cloud
    Azure will be Unit4’s preferred public cloud deployment platform globally. Microsoft’s approach to Azure IaaS cloud deployments and Unit4’s Cloud Your Way methodology, which gives customers the flexibility to run their applications in public and private clouds, are key components and the foundation of this collaboration. Both organizations also cater to a hybrid computing environment which will be the reality in the enterprise world for some time to come. Azure enhances Unit4’s offering through flexible security, data privacy and residency. With a deep understanding of the challenges facing the modern enterprise, Unit4 Business Applications with Azure offer highly scalable ramp-up and ramp-down capabilities and provide customers with the flexibility and adaptability they need to run their business without disruption.

    MyPOV – More detail on what both vendors plan to deliver. Good to see acknowledgement of the hybrid reality, which is key for ERP sales these days. Data privacy and residency is another key component, so good to see that both are looking at this.

    “Unit4’s business applications and People Platform made them a strategic company for us to work with to help realize self-driving ERP,” stated Nicole Herskowitz, Senior Director of Product Management, Microsoft Azure. “We are excited to be working with Unit4 and through this collaboration, we are not only delivering accelerated value but also enabling faster time to innovation in the cloud for our customers.”
    MyPOV – Good for Unit4 to get a product development executive for the quote, which always bodes more commitment than e.g. a business development executive. Both vendors will need to work closely together to pull this off, and it is good to see that the senior partner, Microsoft, has skin in the game. […]

    Overall MyPOV

    It’s time for the smaller enterprise software vendors to pick their clouds… Last week it was JDA (with Google); NetSuite also picked Microsoft (though to a lesser extent than Unit4). Even SAP entered a partnership with IBM last fall. And it all started with Infor choosing AWS a year ago.

    Unit4 has done a very good job articulating how key Microsoft capabilities in Azure, Machine Learning, Office and PaaS will be leveraged. They are crucial to make Unit4’s vision of ‘self-driving’ ERP a reality. As such Unit4 probably takes on the largest dependency of all ERP vendors, but is also up for the biggest upside. The good news is that Microsoft needs to make all these capabilities work in order to attract Azure business. So it is a pretty safe bet for the much smaller Unit4 to make.

    On the concern side, Unit4 needs to make a very compelling, but also non-trivial ERP vision happen with ‘self-driving’ ERP. No other vendor has shipped dynamic context functionality with the breadth of an ERP system. No small undertaking, and Unit4 needs to make it work. But it is better to have to execute on a compelling vision than to have no vision at all. We expect Microsoft to support Unit4 as best it can, as it needs to create an enterprise ISV showcase in the overall IaaS and PaaS market.

    Overall it is good to see smaller vendors going after a better R&D return by leaning on the technology offerings of larger technology partners. To some extent that has always happened in the past, e.g. for RDBMS. In the 21st century the dependency gets bigger, but with that the applications also become more powerful. Congrats to both vendors on this partnership; we are curious to see what will be developed.

    More on Public Cloud Firsts:
    • Infor runs on Amazon AWS (read here)
    • SAP on IBM Cloud (read here)
    • Lumesse on Salesforce (read here)
    • NetSuite on Azure (read here) and 
    • JDA on Google (read here).

    More on Unit4:
    •  Progress Report - Unit4 lays out a big vision - now it needs to execute - read here
    • News Analysis - Unit4 acquires Three Rivers Systems - read here

    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard


    Probably the hottest area in building next generation applications these days is microservices running on containers. One key challenge with containers has been that they do not persist data. Enter ClusterHQ, which addresses this with its Flocker offering that just became available today.

    Docker containers get persistent with Flocker by ClusterHQ.

    So let’s dissect the press release in our custom style (the press release can be found here):

    SAN FRANCISCO – June 17, 2015 – ClusterHQ, The Container Data People, today announced the general availability of Flocker 1.0, its container data management software. By enabling stateful Docker containers to be easily moved between servers, Flocker facilitates widespread production deployment of containers for databases, queues and key value stores. Modern applications are being built from both stateless and stateful microservices; Flocker makes it simple and practical for entire applications, including their state, to be containerized to take full advantage of the portability and massive per-server density benefits inherent in containers. This operational freedom allows DevOps organizations to increase the value of their Docker investment and opens the door for containers to be used in a greater variety of mainstream enterprise use cases in production. Flocker 1.0 is available as an Apache-licensed download at

    MyPOV – Nice description of what Flocker does, making the leading container service, Docker, stateful. Good to see support for databases, but also queues and value stores. Many next generation application use cases go beyond traditional databases and use queues and / or (named) value stores. Rightfully ClusterHQ stresses that more use cases become possible now for containers. And no surprise, Flocker is open source, Apache license.

    “Containers are emerging as one of the most important innovations in modern computing since virtualization. Forward-thinking developers and organizations are latching onto this phenomenon for good reason. Containers are revolutionizing how microservice-based applications are built and operated, by delivering orders of magnitude better density, and unprecedented portability of applications. By making it possible for containers and their data volumes to be moved in production IT environments, innovative vendors are enabling the business benefits of containers to reach even further, removing barriers to Dockerizing everything,” noted Holger Mueller, VP and principal analyst, Constellation Research.
    MyPOV – Usual quote – no comment needed. [I have to add ‘dockerizing’ to my regular vocabulary though.]

    Empowering portability of containers and their data as a single unit, a prerequisite for many production uses, Flocker lets DevOps teams easily containerize their stateful microservices, thus consolidating their entire distributed application into an all-Docker development and operations environment. With everything running in containers, IT operations can be simplified into a unified set of operational processes. Moreover, making it easy to containerize stateful microservices decreases costs so that far more applications can be run on a given set of hardware. Because Flocker enables easy migration of stateful containers, organizations now have the flexibility to accommodate common IT processes such as routine maintenance and load balancing of workloads across servers. The downstream impact is meaningful in today’s real-time economy: businesses can innovate faster and become more responsive to their customers.

    MyPOV – In most cases we have seen development organizations shying away from stateful use cases for containers. Those who pushed onwards had to orchestrate complex operations to coordinate code (in containers) and data (in a variety of storage formats) with elaborate DevOps mechanisms. And most were so complex that they hurt what most next generation applications are all about – elasticity of resources. Now there is an option to get both handled in a single product / construct with Flocker.

    “Organizations use Docker to accelerate their application development lifecycle and achieve frictionless portability of their distributed applications to anywhere. ClusterHQ identified early on that running stateful services in containers would help drive Dockerized applications more rapidly into production. Their technology helps organizations that want to take advantage of Docker’s benefits for stateful as well as stateless applications.” said Nick Stinemates, head of business development and technical alliances, Docker, Inc.

    MyPOV – Good for Flocker to get a quote from the 800-pound gorilla of containers, Docker. It begs the question what Docker’s plans are here – but for now it looks like a good (informal) partnership, as we often see these days.

    Flocker provides an API and CLI for managing containers and data volumes, and works with multiple storage solutions including Amazon EBS, Rackspace Cloud Block Storage, any OpenStack Cinder-compatible device, EMC ScaleIO and EMC XtremIO. The pluggable nature of Flocker is designed to work with any storage system, delivering the ability to integrate dockerized applications with existing storage backends. Any company or community that wants its storage to work with Docker can easily write a driver for Flocker.

    MyPOV – Good to see a modularized, extensible architecture of Flocker, with pluggable drivers for different storage systems. And for those left out right now – though the initial scope delivered by Flocker is an impressive first release – they can build their own drivers. A good showcase for openness and dynamics of the open source ecosystem.
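    The pluggable driver architecture can be sketched as follows. This is a hypothetical registry and interface of my own design, purely to illustrate the idea, not Flocker's actual driver API: the data layer talks to an abstract volume driver, and each storage backend registers its own implementation.

```python
# Hypothetical sketch of a pluggable volume-driver registry - not the real
# Flocker driver API. Each backend implements the same minimal interface.

class VolumeDriver:
    def create_volume(self, name, size_gb):
        raise NotImplementedError

    def attach_volume(self, name, host):
        raise NotImplementedError

DRIVERS = {}

def register_driver(backend, driver):
    """Any storage vendor can plug in by registering a driver instance."""
    DRIVERS[backend] = driver

class InMemoryDriver(VolumeDriver):
    """Toy backend used here only to show the plug-in mechanism."""
    def __init__(self):
        self.volumes = {}

    def create_volume(self, name, size_gb):
        self.volumes[name] = {"size_gb": size_gb, "host": None}
        return name

    def attach_volume(self, name, host):
        # Attaching a volume to a host is what lets a stateful container
        # move between servers together with its data.
        self.volumes[name]["host"] = host
        return self.volumes[name]

register_driver("in-memory", InMemoryDriver())
```

    The point of such an interface is exactly what the press release describes: the orchestration layer never needs to know whether the bytes live on Amazon EBS, an OpenStack Cinder device or an EMC array.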

    “The efficiency and operational freedom of using containers for both compute and storage create a competitive advantage so significant that smart organizations are seeking ways to containerize more of the applications that are strategic to their business. The ClusterHQ mission is to make it as easy to containerize data as it is to containerize compute,” said Mark Davis, CEO of ClusterHQ.

    MyPOV – Key quote with the ClusterHQ mission – make it as easy to containerize data as it is to containerize compute.

    In addition to this news, ClusterHQ today announced a partnership with EMC, as well as the compelling findings from a recent third party survey regarding current and planned adoption of container technology across organizations of all sizes. To learn more about the EMC partnership; to access the survey and report visit: […]

    MyPOV – Good to see ClusterHQ getting the interest of storage heavyweight EMC. And likewise good to see that EMC is looking at this dynamic ecosystem, which could disrupt how the storage market operates.

    Overall MyPOV

    Good to see a vibrant microservices ecosystem. Even better to see vendors like ClusterHQ tackling tough and crucial capabilities that expand the use cases that can be automated with container-based next generation applications. From my (more enterprise-software-shaped) experience and network, the majority of use cases live in a persistent world. If you can’t make a business process persistent, it may as well not have happened. So adding persistence probably increases the use cases that containers can address by 3-5x.

    On the concern side, these are early days, and this is not a trivial problem to solve, so it will be key to see first use cases and the success of early adopters. And it needs to work, reliably, 24x7. It also raises the ante for containers themselves – users can no longer just start another ‘container engine’ to replace one that has gone bad; if a container crashes, state is lost. And that is not a good outcome for most use cases.

    But overall congrats to ClusterHQ for providing a key capability to Docker with Flocker. Major step, we will be watching the adoption. 

    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard


    Earlier this week Oracle put the world through a ‘monster’ 5 hour online event updating its cloud strategy – with a focus on the IaaS, PaaS and DaaS aspects. SaaS was quickly presented by Ellison, but I guess more will come later in the year (OpenWorld?). The event had been planned for a long time, so any speculation that it was a reaction to Oracle’s recent Q4 / FY 2015 earnings does not bear much substance. But it was overdue for Oracle to clarify what it has done, plans to do and will do for the ‘lower’ layers of its cloud strategy.

    So here are my Top3 takeaways (the press release can be found here):

    Simplification – in the past Oracle pegged its products to IaaS / PaaS etc. – now we have the Oracle Cloud Platform (for PaaS) and the Oracle Cloud Infrastructure Services (for IaaS). A good top level naming convention, but the same massive offering lies underneath. It is good to see that the official strategy for Oracle Cloud Platform is to bring Oracle’s Database and Middleware offerings together for ‘customers and partners anywhere in the world through the Internet’. Note the internet – more below. It is somewhat simpler for the Oracle Infrastructure Services portfolio, which offers infrastructure services for workloads in an ‘enterprise grade cloud managed, hosted and supported by Oracle’. And these are the 3 basic computing services: compute, storage and networking. As mentioned in the progress report of the cloud analyst summit in spring (see here), that is the area where Oracle is relatively most behind, compared to PaaS and SaaS.
    Oracle PaaS Customers

    The interesting aspect of Oracle Cloud Platform is the ‘through the internet’ addition in the strategy charter above. Oracle delivers one of the few offerings that can be deployed both on Oracle’s cloud infrastructure and on premises. In the latter case, support, upgrades and maintenance are delivered through the internet. Oracle keeps stressing that the same code runs on both sides, and workloads can be moved transparently as customers wish. This remains a key differentiator for Oracle versus most cloud offerings in the market. It will be good to see some proof points of customers taking advantage of this capability, especially for hybrid usage of the offering. What we see right now is that it is more of a binary (on premises vs. cloud) decision in sales conversations and deployment decisions.
    Oracle PaaS Momentum

    GA of 6 cloud services – The main message from the product side was that 6 Oracle cloud services are now available: Oracle Database Cloud – Exadata Service, Oracle Archive Storage Cloud Service, Oracle Big Data Cloud Service and Big Data SQL Cloud Service, Oracle Integration Cloud Service, Oracle Mobile Cloud Service and Oracle Process Cloud Service. All six are major achievements for Oracle, with making the Oracle database available on Exadata machines in the Oracle cloud being a potential cannibalization of Exadata hardware sales – quite remarkable. And Oracle introduced its competitor to Amazon AWS Glacier with the Oracle Archive Storage Cloud Service, not surprisingly at a lower price point, though it wasn’t clear if the services are 100% identical. The two BigData services are interesting as they move BigData usage closer to business users, which overall is the right strategic direction for all enterprise services. The same is valid for Oracle Mobile Cloud Service, which allows business users to build (simple) cross platform mobile applications – a product that, once fully working and adopted, will change the way mobile applications are built. And Oracle Integration Cloud Service is interesting for the same approach, bringing the tools of enterprise software integration to business end users. An ambitious goal, but Oracle has made steps toward achieving it.

    All PaaS Announcements

    But there was more – in total Oracle announced 23 PaaS services: some key networking services to make the Oracle vision a reality (Site to Site VPN, Direct Connect), a lot around the polyglot / multi language capabilities of the Oracle cloud (next to Java, now Node.js and Ruby), but the most interesting ones to me remain the business end user enablement services around Application Builder, Data Visualization, Big Data Preparation and Discovery, and a ‘make sense of IoT data’ service. While a number of the announced services are basic enablers, some even table stakes, and others bring existing Oracle products to the cloud, the real differentiators going forward lie in the business end user enablement services.

    Oracle comparison on GB / month

    It’s all about TCO – We have written before that at the core of Oracle’s organizational DNA are TCO savings, starting with the first product reducing database costs with the relational database. For Oracle in 2015 this means engineering the whole technology stack together, in order to achieve lower cost for performance. And while Oracle did not offer any specific savings beyond storage (Oracle claims to be cheaper than Amazon AWS, EMC and IBM) at the event, the recent announcement of Oracle getting into the two socket server market shows the power of the approach (we covered it here). What was remarkable there is that Oracle did not want to be in ‘commodity hardware’ – but changed its strategy (still wondering and speculating why) and then applied the integrated technology stack strategy and approach to the problem. The result is a significantly lower TCO for the Oracle two socket server systems (compared to other market leaders), so the strategy delivers even in areas that were not original targets. A good capability to have for any technology vendor in fast shifting technology markets.

    Overall MyPOV

Another Oracle cloud event with a lot of announcements, but also 6 PaaS cloud services in GA – that is good progress. Oracle did not mention when all of the announced services would be GA (or I missed it in the 5 hours, sorry), so it can’t hide the fact that Oracle is ‘still getting there’. What is impressive is the revenue Oracle can already generate based on this early offering; the vendor claims to have had the industry’s record ‘cloud’ quarter with its Q4, with $400M+ in new subscription revenue. Which makes clear that the game for Oracle is execution, both on the product and on the sales side.

On the concern side, Oracle has to build and deliver a lot. One of the largest engineering projects on the planet, if not the largest. And due to the integrated nature of the offering, more bits and pieces have to work flawlessly together. Anyone with experience in enterprise software knows that harmonizing the release of two products is more often a challenge than not. Oracle is releasing hundreds of products that need to work together. And Oracle is building its cloud ‘top down’ – having more mature SaaS than PaaS, and more mature PaaS than IaaS products (or replace mature with available). In an ideal world you would go IaaS to PaaS to SaaS. But that’s the place where Oracle finds itself for historic reasons; now it needs to retrofit its SaaS offerings to take advantage of its new PaaS and IaaS capabilities. Take the very promising Oracle Integration Cloud Service, which will have out of the box integrations, built by the SaaS teams. So interfaces are being rebuilt, re-tested etc. Ultimately Oracle has the experience and resources to get it all done, and in the long run it is cutting no corners and doing the right thing, but it takes time, investment and stamina. For now it looks like Oracle has all three.

On the differentiation side, Oracle is probably the cloud vendor that focuses the most on business end user enablement, a new era of self-service that is enabled by the cloud. With that it changes the way enterprise software is created, implemented and supported. Fewer partners and SIs, more business users who build what they need and want quickly. In a world that is accelerating every day, this is a key capability for business users, who dread nothing more than going to IT and paying a partner, and who increasingly cannot afford to wait days for a go-live because some software had to be changed somewhere.

The other differentiator for Oracle is the completely integrated technology stack, where it holds a unique position in the market. Oracle needs to show the advantages of this to decision makers, who traditionally are concerned about too much vendor tie-in, but on the flip side want to avoid too many integration worries. TCO is where decision makers and Oracle can come together on this, and given Oracle’s TCO driven corporate DNA, Oracle is well positioned for that conversation. We saw glimpses of that at the event when it came to storage costs and the number of steps it takes to get a product up and running. These stats are interesting but need customer backup sooner rather than later.

    A compelling vision and some good early proof points for Oracle, but it is early, key capabilities need to be made available not only announced, so it is execution and then customer adoption time for Oracle. We will be analyzing.


    More on Oracle:

    Future of Work / HCM / SaaS research:
    • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
    • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
    • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
    • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
• Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

    Also worth a look for the full picture
    • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
    • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
    • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
    • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
    • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
    • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
    • Is it all coming together for Oracle in 2014? Read here
    • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
    • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

    And if you want to read more of my findings on Oracle technology - I suggest:
    • Progress Report - Good cloud progress at Oracle and a two step program - read here.
    • Oracle integrates products to create its Foundation for Cloud Applications - read here.
    • Java grows up to the enterprise - read here.
    • 1st take - Oracle in memory option for its database - very organic - read here.
    • Oracle 12c makes the database elastic - read here.
    • How the cloud can make the unlikeliest bedfellows - read here.
    • Act I - Oracle and Microsoft partner for the cloud - read here.
    • Act II - The cloud changes everything - Oracle and - read here.
    • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


Amazon AWS surprised markets with its announcement that it will put an AWS region into India in 2016. AWS has been relatively slow to roll out data center regions in the past, partly because they are massive pieces of infrastructure and always include a redundancy site.

The press release can be found here – let’s analyze in custom Constellation style:

    India-based AWS Infrastructure Region will enable customers to run workloads in India and serve Indian end-users with even lower latency

MyPOV – Until now AWS said that locality did not matter, given the massive number of Edge locations (50+) AWS operates. The case of India shows that ultimately Edge locations do not help when the latency to the origin server is too high. And Indian customers already have to deal with slow, high-latency networks. Anyone who has downloaded corporate email in Mumbai or Bangalore knows speed is an issue. So good to see Amazon acknowledge this.

SEATTLE--(BUSINESS WIRE)--Jun. 30, 2015-- Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ:AMZN), today announced that it will open an AWS infrastructure region in India for its cloud computing platform in 2016.

MyPOV – 2015 sees a departure from AWS’s practice so far of announcing regions only when they go GA, as practiced last in Germany (read the launch news analysis here).

    "Tens of thousands of customers in India are using AWS from one of AWS's eleven global infrastructure regions outside of India. Several of these customers, along with many prospective new customers, have asked us to locate infrastructure in India so they can enjoy even lower latency to their end users in India and satisfy any data sovereignty requirements they may have,” said Andy Jassy, Senior Vice President, AWS. “We're excited to share that Indian customers will be able to use the world’s leading cloud computing platform (AWS) in India in 2016 – and we believe India will be one of AWS's largest regions over the long term."

MyPOV – Jassy nails the issues around data center location: speed and sovereignty requirements. Interesting that he mentions India will ultimately be one of AWS’s largest regions in the long term. India certainly has a lot of potential, but one could also interpret that (at this point) AWS does not operate any data centers between e.g. Germany and Australia on the Eurasian route between the two locations. Not surprisingly the traditionally secretive AWS does not mention the location, but we expect it to be in Northern India to reduce latency not only inside of India, but also to adjacent geographies.

    Customers in India such as Hike, PayTM, ZEDO, Freshdesk, Inmobi, Capillary Technologies, HackerEarth, Getit, Ferns N Petals, redBus, Druva, Vserv, Hungama, Tata Motors, Jubilant Food Works, STAR India, Future Group, Manipal Global Education, Classle, NDTV, Dalmia Bharat Sugar, Usha International, Macmillan India, Apeejay Stya and Svran Group are already using AWS to drive cost savings, accelerate innovation, speed time-to-market, and expand geographic reach in minutes.

MyPOV – That is an impressive load, and possibly a longer customer list (at least of what AWS mentioned publicly) than when they opened the German region recently.

    Tata Motors Limited is a leading Indian multinational automotive manufacturing company headquartered in Mumbai, and is part of Tata Group. The company’s customer portals and its Telematics systems, which lets fleet owners monitor all the vehicles in their fleet on a real-time basis, are running on the AWS Cloud. Tata Motors has recently built a parts planning system to forecast spares demand by using ordering and inventory patterns. They use AWS for development landscapes immediately after the project kicks off, which shaves four to six weeks of setup time in a typical project cycle. “Whenever we plan on rolling out a new project or experimenting with a new technology, AWS helps us in quickly provisioning the required infrastructure and enables us in getting up and running at a fast pace,” said Jagdish Belwal, Chief Information Officer of Tata Motors. “AWS has helped us become more agile and has drastically increased our speed of experimentation and therefore, innovation.”
MyPOV – Good to hear from an Indian marquee brand like Tata Motors. What is not mentioned is that Tata (like all other Indian customers) will be more productive in their new development and test systems thanks to lower latency to the new Indian region. From my own experience of switching data centers into India, this can be a substantial and often dramatic performance improvement.

    NDTV is India's leading media house with TV channels watched by millions of people across the world. NDTV has been using AWS since 2009 to run their video platform and all their web properties on AWS. During the May 2014 general election, NDTV using AWS was able to handle the unprecedented web traffic that scaled 26 times from 500 million hits on a normal day to 13 billion hits during election day, and regularly peaked at 400,000 hits per second. “We have been an early adopter of AWS and the benefits that we experience is beyond just cost savings, it is the agility that enables us to move fast with new projects that makes a positive impact and real difference to our business,” said Kawaljit Singh, CTO of NDTV Convergence. “We are very impressed with the staff and tech support teams of AWS, who have been most helpful in providing support and guidance throughout our cloud journey. They worked hand-in-hand with our team so that we are able to handle the massive scale and unpredictability of workloads for the general election event last year, and as a result, the entire process took place without any hitch at all.”

MyPOV – Another great showcase for the benefits of a local Indian region – streaming and media house NDTV. Again a bandwidth-hungry and performance-critical use case.
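The NDTV election-day figures are easy to sanity-check with some back-of-the-envelope arithmetic. A minimal sketch – the input numbers come straight from the press release quoted above; the sustained-rate derivation and the 24-hour averaging assumption are my own:

```python
# Back-of-the-envelope check of the NDTV election-day traffic figures.
normal_day_hits = 500_000_000        # 500 million hits on a normal day
election_day_hits = 13_000_000_000   # 13 billion hits on election day
peak_rate = 400_000                  # reported peak, hits per second

# The claimed 26x traffic multiple checks out:
multiple = election_day_hits / normal_day_hits
print(multiple)  # 26.0

# Average sustained rate over 24 hours, assuming traffic spread across
# the whole day -- useful for comparison against the 400k/sec peak:
avg_rate = election_day_hits / (24 * 60 * 60)
print(round(avg_rate))  # 150463
```

The average of roughly 150k hits/second against a 400k/second peak illustrates why elastic capacity, rather than provisioning for the peak, is the attraction of AWS for a use case like this.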

    Ferns N Petals is a leading flower and retailer in India with 194 outlets in 74 cities and delivery across 156 countries worldwide. Prior to using AWS, Ferns N Petals was running its IT infrastructure in a traditional datacenter. They turned to AWS in the year 2014 when their business grew rapidly and decided to move their entire online business to AWS. Since moving to AWS, they are able to manage rapid growth of their users’ traffic that peaks at 80 percent during the festive seasons. “Our experience with AWS over the past year has been excellent. AWS is now the cornerstone in our growth strategy, “ said Manish Saini, Vice President of online business for Ferns N Petals. “We have recently launched two new businesses that include new overseas expansion that are all running on AWS. We are now able to spend more time and resources in areas that matter to our customers such as new mobile app development that will enhance their buying experience.”

MyPOV – Another well picked reference by AWS, Ferns N Petals. Bringing a region to India will not only serve local customers, but also – as in this case – sway them further toward the AWS platform, as it allows them to standardize platforms for global expansion.

    Novi Digital is a wholly owned subsidiary of STAR India, one of the largest media and entertainment companies in India. The company uses AWS to run hotstar, a flagship OTT platform for drama, movies and live sporting events. With more than 20 million downloads in four months, hotstar has seen one of the fastest adoptions of any new digital service anywhere in the world. In fact, during one of the Cricket World Cup matches, hotstar and combined reached a record total of over 2.3 million concurrent streams and more than 50 million video views. “The reliability of the highly scalable AWS cloud platform has enabled hotstar to break many records in the last four months,” said Ajit Mohan, Head of Digital, STAR India. “AWS has been a key partner in helping us deliver a compelling and seamless experience for millions of users.”

    MyPOV – Another media company, always good showcases.

AWS also has a vibrant ecosystem in India, including partners that have built cloud practices and innovative technology solutions on the platform. The AWS Consulting Partners in India include Accenture, Blazeclan, Frontier, Intelligrape, Minjar, Progressive, PWC, SaaSforce, SD2labs, Team Computers, Wipro, and many others. Among the AWS Technology Partners in India are Adobe, Druva, Freshdesk, Indusface, Microsoft, Newgen, RAMCO, SAP, Seclore and many others. For the full list of the members of the AWS Partner Network, please visit:

MyPOV – Good to see that the services side is equally engaged in India. And as we can see, India is also a technology partner / ISV play. Most importantly, it matters to be in India.

    Overall MyPOV

A good move by Amazon, as India is a key location from a high tech vendor presence, market, and future potential perspective. Additionally the latency aspect can be addressed, which plays a role across the geographic region, not to mention data residency and sovereignty implications. Of the other major cloud IaaS vendors, only IBM already had a datacenter there. It will be interesting to see if and how Google and Microsoft respond. Both have significant load in India. And Oracle and SAP have large ecosystems and development teams in the country, so don’t be surprised if they stake a flag in the ground there, too.

On the concern side, the question is where else Amazon should have invested earlier or is not investing now. Given the vendor has departed from announcing a region only when it is up and running (last practiced in Germany), we can probably safely assume no other region will overtake the India region, announcement- or go-live-wise. But you never know. Notable is the lack of any Latin American, Mexican, South African and Chinese presence. Russia’s recent data privacy and residency requirements make it a location to watch, too.

Finally, it is curious that the announcement was made on the same day as the AWS Summit in Berlin. I don't think CTO Werner Vogels spent more than 10 seconds on the announcement, and the German audience seemed to care little, if at all. Why the two overlapped is not clear to me, but 6/30 – the last day of the quarter – may be a hint for someone financially more astute than me. But overall a good move by Amazon, hitting all the right chords on why to put a region into India: latency, data residency and sovereignty, as well as a growing and attractive market.

    More on AWS
    • Event Report - AWS Summit San Francisco - AWS pushes the platform with Analytics and Storage [From the Fences] read here
    • Event Report - AWS re:invent - AWS becomes more about PaaS on inhouse IP - read here
    • AWS gives infrastructure insights - and it is very passionate about it - read here
    • News Analysis - AWS spricht Deutsch - the cloud wars reach Germany - read here
    • Market Move - Infor runs CloudSuite on AWS - Inflection Point or hot air balloon? Read here
    • Event Report - AWS Summit in SFO - AWS keeps doing what has been working in the last 8 years - read here
    • AWS  moves the yardstick - Day 2 reinvent takeaways - read here.
    • AWS powers on, into new markets - Day 1 reinvent takeaways - read here.
    • The Cloud is growing up - three signs in the News - read here.
    • Amazon AWS powers on - read here.

    Other cloud related:
    • Musings - Are we witnessing the rise of the enterprise cloud? Read here
    Find more coverage on the Constellation Research website here.


    We had the opportunity to attend the analyst summit for IBM’s cloud efforts, held last week in New York at the brand new and beautiful IBM Cloud and Watson HQ at Astor Place. The event was very well attended, despite the summer season and the Friday date, showing the interest that IBM has gained in cloud matters in the analyst community. 

We learnt a lot of things – tough to distill to a Top 3 – but here you go:

Messaging Improvement – Discounting transcontinental travel and a hot day in New York City, I was impressed by the improved messaging by IBM executives, listening to LeBlanc, Rippert et al. It looks to me like the formation of the new Cloud Business Unit has helped, as has IBM hiring a number of key executives from the outside. So it’s no longer about speeds and feeds, but about business model change, creating differentiation for customers, enabling new business models, understanding the power and relevance of open source as a platform, and enabling enterprises to build next generation applications with a modern PaaS, Bluemix. It is still early days, but IBM can speak more about customers using this than ever before; we were walked through about a dozen customer scenarios and references as the day progressed. The good news is not only having these customer stories, but getting updates about their usage of the IBM cloud products and equally learning about broader initial uptake in new accounts.
    Leblanc opens the IBM Cloud Analyst Summit in NYC
    LeBlanc opens the Cloud Summit

Hybrid is on the rise – We have advised, talked, presented and blogged about 2015 being the year of the rise of the hybrid cloud in terms of product enablement. And no surprise, IBM is no exception, with most of Sabbah’s presentation focused on the topic. IBM plans to deliver a product (for now, I guess) called the ‘Hybrid Controller’, which is tasked with combining on premise and cloud resources securely and reliably across different data centers. It is clear that IBM hears from its customers that pure public cloud is not the immediate path and desire for all of them, so it is key for IBM to be able to play on all points of the cloud economics spectrum: from value on traditional servers in traditional data centers, over private, to dedicated (aka managed), to public cloud. Being able to do this across public clouds (Amazon AWS and Microsoft Azure were mentioned by logo on the slides - see below) is important for IBM to keep its position as a trusted advisor to the CIO, beyond only IBM products and brands. Finally the acquisition of BlueBox (see my News Analysis here) is a key enabler in the immediate future for IBM to make the hybrid cloud a reality.

    The IBM Hybrid Cloud vision
    The IBM Hybrid Cloud vision 

Bluemix more and more in the mix – It is becoming clear how essential Bluemix is to the IBM cloud strategy, and that starts with adoption. Robinson shared uptake numbers for Bluemix, and given the product is officially only 1.5 years old, they are notable: Bluemix is the largest CloudFoundry deployment on record and is adding 10k users every week. In April Bluemix logged 1B API calls, an impressive number. But IBM still wants to see even more, and has put social media / community veteran Carter on community outreach for Cloud / Bluemix. IBM is also doing well on the partner front with Twitter, CitiBank, Box, Apple and NASA, as well as very good system integrator uptake, with Accenture, CapGemini, Deloitte, EY, KPMG, PWC, TechMahindra and Wipro being mentioned.

    1.5 years of IBM Bluemix PaaS
    1.5 years of Bluemix
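Robinson's Bluemix adoption figures can be put in perspective with a quick derivation. A minimal sketch – the 1B API calls number is from the briefing above; the 30-day month and even-spread assumptions are mine:

```python
# Rough average rate implied by "1B API calls in April".
api_calls = 1_000_000_000
seconds_in_month = 30 * 24 * 60 * 60  # 2,592,000 seconds, assuming 30 days

avg_calls_per_second = api_calls / seconds_in_month
print(round(avg_calls_per_second))  # 386
```

Roughly 386 API calls per second sustained around the clock – a useful way to judge whether a platform's headline monthly numbers translate into meaningful continuous load.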


Overall MyPOV

IBM momentum keeps growing, as IBM keeps investing in datacenter locations, additional software capabilities and ecosystem building. It looks like the organization with a dedicated cloud group under LeBlanc is paying off, as the team is in place, seems to be working well together, and most importantly is focused only on the IBM cloud success.

On the concern side, IBM is a cloud player with less ‘organic’ load than competitors. IBM reminds us of the 1000 SaaS properties that are in the IBM fold, but that’s not enough to reach the economies of scale that are needed to succeed long term in the cloud. When I asked LeBlanc about this, it was good to see that IBM has realized it and sees the main growth of load coming from Strategic Outsourcing deals. IBM has signed a number of those in the last quarters, but will need to push the gas pedal down even more, as traditional (hardware) competitors Cisco, Dell and HP (finally) get their cloud game together and create more competition. But it is good to see that IBM executives are aware of the challenge and are addressing it.

Overall a good event for IBM, which is focused on and investing in cloud, and has some key differentiating assets over the competition: more data center locations around the world, an enterprise grade PaaS product with Bluemix, and attractive differentiators in cognitive computing with Watson, Analytics, BigData and Security. And last but not least, IBM has access to almost all CIOs around the world, so it will be interesting to see how well IBM can leverage that going forward. We will be watching.

More on IBM:
• Market Move - IBM gets into private cloud (services) with Blue Box acquisition - read here
    • Event Report - IBM InterConnect - IBM makes bets for the hybrid cloud - read here
    • First Take - IBM InterConnect Day #1 Keynote - BlueMix, SoftLayer and Watson - read here
    • News Analysis - IBM had a very good year in the cloud - 2015 will be key - read here
    • Event Report - IBM Insight 2014 - Is it all coming together for IBM in 2015? Or not? 
    • First Take - Top 3 Takeaways from IBM Insight Day 1 Keynote - read here
    • IBM and SAP partner for cloud - good move - read here
    • Event Report - IBM Enterprise - A lot of value for existing customers, but can IBM attract net new customers? Read here
    • Progress Report - The Mainframe is alive and kicking - but there is more in IBM STG - read here
    • News Analysis - IBM and Intel partner to make the cloud more secure - read here
• Progress Report - IBM BigData and Analytics have a lot of potential - time to show it - read here
    • Event Report - What a difference a year makes - and off to a good start - read here
    • First Take - 3 Key Takeaways from IBM's Impact Conference - Day 1 Keynote - read here
    • Another week and another Billion - this week it's a BlueMix Paas - read here
    • First take - IBM makes Connection - introduces the TalentSuite at IBM Connect - read here
    • IBM kicks of cloud data center race in 2014 - read here
    • First Take - IBM Software Group's Analyst Insights - read here
    • Are we witnessing one of the largest cloud moves - so far? Read here
    • Why IBM acquired Softlayer - read here
    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


Workday earlier today informed the markets via a press release that it is expanding its footprint in healthcare with a functional extension beyond its traditional HCM and more recent Financials efforts: the addition of inventory management.
    So let’s take the press release apart in our usual commentary style (it can be found here):

PLEASANTON, CA -- (Marketwired – July 9, 2015) – Workday, Inc. (NYSE: WDAY), a leader in enterprise cloud applications for finance and human resources, today announced plans to deliver a new application, Workday Inventory, as well as new features for Workday Procurement – designed to meet the supply chain management needs of healthcare providers. Combined with Workday Financial Management and Workday Human Capital Management (HCM), this new supply chain management functionality will equip healthcare providers with one system that offers visibility into talent needs, the flexibility to quickly adapt to new regulations and industry standards, and the ability to manage inventory and supplier interactions. Workday plans to design the expanded suite based on feedback from charter members of the Workday Healthcare Advisory Council, including Christiana Care Health System and Community Health Services of Georgia.

MyPOV – Workday extends its capabilities with Workday Inventory, tuned to the needs of the healthcare industry. Healthcare has already been a strong industry for Workday, so this addition will help the vendor, its customers and its prospects in this vertical. The healthcare industry has some specific inventory requirements (expiry, storage requirements, access, replenishment etc.), so it is a good industry for Workday to pick: it offers enough differentiation while not requiring a full-fledged, horizontal SCM capability.

    Extending the Power of One System for Healthcare Providers

    Healthcare providers – such as hospital systems, academic medical centers, and long-term care groups – are faced with dramatic industry changes including widespread industry consolidation and new business models which are forcing them to rethink their talent and organizational needs. Current systems are inflexible and costly to maintain, making it difficult for them to adapt to changes, and get the insights they need to move their organizations forward. 

MyPOV – This is a nice way of Workday saying that many of the incumbent systems in the healthcare industry are older, not cloud based systems. This is an opportunity for Workday and competitors to replace many of these older systems, but healthcare providers don’t want to break existing integrated automation assets for the sake of modernization, so they may well have asked Workday (and others) to step up roadmap plans for Inventory / SCM.

    With Workday, healthcare providers are already equipped with one system that:

    • Adapts to Change: Customers are able to navigate reform-driven changes, mergers and acquisitions, and organizational restructuring via a flexible business process framework. 

    • Delivers a Comprehensive View of Talent: With Workday, customers are able to better attract and retain talent through powerful recruiting tools, rapid onboarding, performance and talent management, and insights to help increase retention. 
    • Provides Visibility to Drive Growth: To continually drive operational efficiency and business growth in a dynamic industry, Workday offers analytics to monitor the financial performance of different business models – including new service lines such as outpatient clinics – and insights into developing talent.

    MyPOV – This is the overall Workday value proposition. Workday is doing a good job at selling the ‘higher’ ground with its core strength of being cloud based and key HCM assets.

    The planned introduction of Workday Inventory and new Workday Procurement features will give healthcare providers the supply chain foundation they need to:

    • Closely Manage Supplier Relationships: With Workday, customers will be able to manage supplier and group purchasing contracts, track inventory, and automate replenishment to reduce costs. In addition, they will be able to more clearly understand detailed supply utilization and costs to make better contracting and standardization decisions.
    • Drive Sustainable Cost Savings:  Workday will enable customers to increase compliance with standard purchasing processes such as applying requisitioning and approval controls. In addition, purchasing teams will have one centralized view of spend by category that they can monitor and analyze to identify opportunities for cost savings. Mobile access, real time analytics, and the flexibility to configure procurement practices based on an organization’s preference will also allow for quicker decision making and adaptability.
    • Effectively Manage and Cultivate Supply Chain Teams: With Workday, customers will have a comprehensive view of talent, and insights on how supply chain teams are performing against business goals. Additionally, supply chain leaders will be able to better develop their teams, measure talent utilization, and cultivate employee skills to better serve the needs of patients and clinicians.

MyPOV – Workday already had Procurement functionality as part of Workday Financial Management (see here), but that was more of a horizontal self-service purchasing capability that does not need inventory at hand: employees order as needs arise, and contracts, approvals, budgets, suppliers etc. are managed to make it happen. For healthcare a more capable inventory functionality is needed (see e.g. above). Again Workday does a good job at what all suite-level vendors do – leveraging existing automation assets: managing supplier relationships from Workday Procurement, mobile access, analytics, and the whole HCM side for talent, goals etc.


    Workday plans to make Workday Inventory generally available to customers in September 2015. Healthcare-related features in both Workday Inventory and Workday Procurement are scheduled to be delivered in calendar year 2016.

MyPOV – Close delivery dates are always a good thing, so kudos for a fairly near availability horizon. It makes sense to deliver general capabilities before vertical ones, but interestingly the press release is bare of horizontal implications – what can e.g. a non-healthcare customer / prospect of Workday expect from Workday Inventory?


    Overall MyPOV

Workday keeps extending its automation portfolio. It’s another data point validating that Workday is in the game for a suite of enterprise functions, beyond not only HCM, but also HCM and Finance combined. It makes sense for Workday to limit functionality to specific verticals, and to verticals where the vendor is already strong, like healthcare. It’s too early for Workday to compete in breadth and depth with full-fledged SCM / Purchasing automation vendors, so this is certainly a valid and viable strategy. It is also another data point that Workday plans to go to market more with a vertical approach, with this press release mentioning and focusing on healthcare. That makes sense as it allows Workday to build vertical functionality and messaging, but it has product implications.

On the concern side, it is one more investment area for Workday, no matter how small and well defined it may be. With new automation come commitments to roadmap, support, maintenance and education. Expanding into SCM also requires Workday to ramp up go to market efforts and expertise, including a partner ecosystem different from its existing, more service-industry and HCM / Finance oriented portfolio – so more investment is needed, and Workday already needs to invest on many fronts. And personally I would like to see Workday do more on HCM, but I am sure the executives in Pleasanton are keeping a watchful eye on Workday remaining competitive in that key area. And it needs to keep that functionality (very) compelling, as it keeps using it (rightfully) to get product efforts like today’s off the ground.

    But overall it is always good to see vendors grow their automation portfolios, as it creates value for customers and prospects. So congrats to Workday for moving deeper into SCM, with the addition of healthcare related inventory capability and new features for Workday Procurement. We will be watching.

    For closer SCM coverage, follow my colleague Guy Courtin here.  

    And more on Workday

    • News Analysis - Workday supports UK Payroll - now speaks (British English) Payroll  - read here
    • Workday 24 - 'True' Analytics, a Vertical and more - now needs customer proof points - read here
    • First Take - Top 3 Takeaways from the Workday Rising Day 1 Keynote - The dawn of the analytics era - time to deliver Insight Apps - read here
    • Progress Report - Workday supports more cloud standards - but work remains - read here
    • Workday 22 - Recruiting and rich Workday 22 are here - read here
    • First Take - Why Workday acquired Identified - (real) Analytics matter - read here
    • Workday Update 21 - All about the user experience and some more - read here
    • Workday Update 20 - Mostly a technology release - read here
    • Takeaways from the and Workday partnership - read here
    • Workday powers on - adds more to its plate - read here
    • What I would like Workday to address this Rising - read here
    • Workday Update 19 - you need to slow down to hurry up - read here
    • I am worried about... Workday - read here

    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    We had the opportunity to attend the NGA HR (formerly known as NorthgateArinso) analyst summit in Chicago. Despite the summer date, the event was well attended by the analyst community.

    So here are my top 3 takeaways:

    Positioning gets clearer – For the longest time NGA has struggled with positioning itself in the different HR markets where it operates. The one stop shop for all things HR seems like a simple message at first, but it is not so trivial when you consider that NGA plays in HR Consulting, Payroll Services, HR Outsourcing and Application Management Services (the maintenance of mostly local HR systems). Then spread that across multiple partner and own products and multiple regions, and it makes NGA a different vendor in each market. The overall challenge NGA faces is that HR Consulting, that is implementation services, is undergoing a massive transformation due to the shift to SaaS. Implementation timelines, budgets and revenues are under immense pressure, very few traditional on premise implementations remain and SaaS implementation revenues are very small in comparison.

    The good news a year ago was that management acknowledged the challenge, the strategy then was to partner with SaaS properties and become their respective implementation partner. Fast forward 12 months and only the partnership with SuccessFactors has been flourishing, implementation partnership ambitions with other SaaS vendors have been scaled back or stopped.

    The challenge for NGA is a shrinking HR Consulting business in the short term and, in the long term, the resulting reduction of the Application Management Services business. Maybe making a virtue out of necessity, SVP of Global Enterprise Sales Sternklar made it crystal clear (pun intended) that the future will be in HR Outsourcing, heavily focused on global payroll. That’s welcome clarity that should help position NGA in many markets.

    NGA does HR Consulting, HR Outsourcing, BPO, Application Maintenance
    The 4 NGA Business Fields

    Payroll Exchange live – A year ago NGA shared its ambitions around the Payroll Exchange, a software layer that allows NGA to operate global payroll customers in an efficient way. Not only does Payroll Exchange connect between NGA products (e.g. euHReka, ResourceLink and Preceda), but also to partner products (e.g. Workday Payroll, the SAP Cloud Payroll) and further 3rd party payrolls. It also gives customers visibility into and access to payroll status and processes – a very good level of transparency, especially when running payroll for a global employee population. Payroll Exchange has been available since January, 2 NGA customers are live and many more are implementing. It is good to see NGA delivering on what it said it would deliver, now it needs to drive adoption of the platform. 

    NGA Payroll Exchange Analytics #NGAHRSummit #BPO
    A glimpse at Payroll Exchange Analytics

    US targets mid-enterprise – NGA walked us through its renewed plans for the US SMB market, which the vendor calls mid-enterprise. It has pained NGA for many years that it did not have a solution for the considerable numbers of employees it needs to service in the US as part of global BPO contracts, but that by themselves were too small to warrant an individual setup of a euHReka system. So NGA has decided to use either Workday or SuccessFactors Talent Management, partner with a local payroll provider and offer this as a mid-enterprise solution. Such an offering needs a high degree of standardization, so NGA decided e.g. that paychecks would only be electronic, the language would only be English etc., to name a few. Executives shared that there is substantial interest in the market, but it is too early to see customer adoption.


    It is not the easiest of times for NGA, with two of its four businesses undergoing major market changes. We think the focus on global HR and complex payroll for global employee populations, delivered via BPO, remains the sweet spot for NGA. It is good to see more focus on that business. We think NGA could be more aggressive in this market, and should start taking market share from competitors who have been less committed in recent years. NGA certainly had to wait for the Payroll Exchange product to be ready; now it will be interesting to see how aggressively NGA can sell, deliver and operate global BPO.

    We received a more North American update than at last year’s summit, but NGA promised to give us more insight on its business beyond North America, and more details on the roadmaps of its IP around euHReka, ResourceLink and Preceda. On the concern side, NGA is under pressure to grow, and it will be key for the vendor to find those growth potentials and realize them in the next quarters. Prospects, clients, ecosystem, influencers and employees don’t want to see shrinking vendors, and frankly there is no reason for NGA not to grow – or at least maintain revenue levels.

    Overall good progress for the North American region that this analyst summit was all about. But given its global capabilities and ambition, it will be good for NGA to have its global leadership present at the next analyst summit (again). Regardless, we will be watching developments, stay tuned. 

    More on NGA HR:

    • Progress Report - NGA moves on - but in many directions - read here
    • NGA executes - but what is the ultimate positioning? Read here
    Here is a quick Storify Tweet collection of the event:

    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    We had the opportunity to attend the CapGemini global analyst summit held close to Paris at the beautiful conference hotel LesFontaines. A great location always contributes to a great meeting and that was the case here – it had the unique mixture of a classic chateau with a modern conference center, a balance the French know how to strike very well. Due to thunderstorms (!) I was 8 hours late and missed the first half day of the event, so check out my colleagues Alan Lepofsky’s and Guy Courtin’s takeaways (Guy’s is here already) for more color and perspective on the event.

    It is always tough to select the Top 3 takeaways, but with global system integrators like CapGemini, it is even harder, nonetheless my best effort here:

    CapGemini is in transition – It became clear that CapGemini is not only in the transition all major services players see – shifting client needs, faster implementations, more partnerships, globalization, and more – but is also transforming itself from a European player with a little global activity to a more global player with a European DNA. E.g. in 2010 European revenue was 79%, in 2015 it was down to 62%, with North America taking the bulk with 30% (these are numbers before iGate). As such CapGemini brings a unique European DNA to the mix, with all its pros and cons. On the pro side it does not have to chase every latest trend in North America and can take a more wait and see European approach. And it truly understands global, something that for some competitors is more lip service than understanding. On the con side it has a more rigid skill base. It was good to see that CapGemini acknowledges the issue and had Head of People Management and Transformation Hubert Giraud speak to the analyst attendees. The training and re-skilling plans and progress are interesting and look successful – but they had better be, as CapGemini needs to work with the people it has in Europe for decades to come. 

    LesFontaines Paris Chateau by Alan Lepofsky @Alanlepo
    Unique setting at LesFontaines (picture credit to +Alan Lepofsky)

    The Cloud is coming – CapGemini sees the cloud as the most profound technology shift hitting enterprises. CTO Cohen walked us through how CapGemini wants to leverage cloud in the form of a combination of products and services that CapGemini calls the Business Cloud. CapGemini sees itself offering a complete cloud implementation and management portfolio, from Cloud Strategy and Workload Migration all the way to BPaaS and CyberSecurity. CapGemini then walked us through an example vertical, financial services, and how it has offered these services successfully to a number of references. The examples were a regional government and a postal delivery service, which have both benefitted from the CapGemini Business Cloud. Remarkably, one of them is live on Amazon AWS already. As usual with system integrator events, the customer references were very good, talking about their experiences with ease and authority, as usual presenting with a CapGemini partner. 

    Paul Hermelin Cap Gemini Les Fointaines June 2015 CEO
    A very relaxed Hermelin addressing the analyst crowd
    Ready for customers – but ready to transform itself? – It was good to hear CapGemini executives candidly and openly speaking about the challenges ahead, actually quite remarkable given the cultural background and the usual reclusiveness of system integrators. It was pretty clear that CapGemini has the skills, understanding, expertise and vision to play with all next generation application use cases of the 201x years (Digital Transformation, Customer Experience, IoT, BigData etc.) – but the integrator still sees these very much from a services angle. In some areas we have seen CapGemini making bets on products, for instance IaaS (e.g. Amazon AWS), PaaS (e.g. IBM Bluemix) and SaaS (e.g. Salesforce), but it seems not to be fully on board with them – the service provider mentality is still strong (the ‘we can make you successful on any product / platform / tool’ mentality). Nothing wrong with that, as it has served the industry well for many decades, but things are changing in the 201x years, where customers want ready to use, pre-built solutions beyond the ‘prebuilt IP’ toolset. We live in exciting times, where business best practices have to be redefined and reinvented on top of the new technology capabilities available now – but we are not sure how far CapGemini will go in a product direction beyond the classic services business. The future will tell.

    CapGemini iGate synergies system integrators
    How CapGemini and iGate complement each other


    A very good analyst event, in a very nice setting, with a ton of content and very good attendance. It is clear that CapGemini can be the right partner for enterprises facing strategic transformation projects – in all aspects of use cases that are thrown towards them (see above). And CapGemini is very well positioned to take advantage of a multi-national and multi-cultural perspective, given its European roots. At the same time the integrator is becoming more global, in particular more North American and more Indian with the iGate acquisition (it just closed – see here).

    On the concern side, CapGemini needs to become more global, successfully re-skill its European workforce and find more productized go to market offerings. It is good to have identified over 4 dozen IP assets, now they need to be built out into products – something system integrators have traditionally struggled with, but the prize in the 201x years is too big not to (at least) try. We would like to see CapGemini make a few wholehearted (real) product bets. Customers want faster, shrink-wrapped, business user driven solutions that do not require RfPs, project teams and people travelling the world in small boxes.

    Overall CapGemini is ready for the Digital Economy, as ready as a system integrator can be in 2015, with all the questions we are facing in the age of uncertain business best practices. These are fast paced and exciting times, we will be watching.

    Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard


    We had the opportunity to attend the Berlin edition of the Amazon AWS event series called AWS Summit that is taking place around the world these months. I was interested in the Berlin location to get an impression of both customer interest and readiness for cloud, as well as AWS’ efforts to get the traditionally skeptical German IT audience to embrace cloud. The event was held at the CityCube location and attended by 2000+ attendees (by comparison, AWS’ first event in Berlin, only 5 years back in 2010, was a small 150 attendee affair). 

    Always tough to collect the Top 3 takeaways from an event like this – but here is my best attempt:

    Germany and the cloud – warming up? – Germany is an interesting market for IT services, given it is one of the largest economies on the planet. But despite German consumers adopting cloud based services in equal fashion to other places, German IT has been traditionally skeptical, to the point of not (yet) adopting the public cloud. The NSA / Prism / Snowden affair certainly did not help here, and recent additional rumors of the CIA spying on more levels of the German government than ‘just’ the chancellor did not help either. In other parts of the world, the consumerization of IT has been the main driver explaining why cloud adoption has happened quickly in corporate IT, too. Not in Germany, and the main reasons why are, in my view, existing IT investments, budget levels that allow companies to keep investing into the ‘old but known and proven’ ways, the high sensitivity to data privacy and protection, and finally the sometimes healthy, sometimes dangerous German stubbornness in its ‘not invented here’ incarnation. These German attitudes have largely led cloud providers to build German data centers, in order to overcome at least the data residency, data privacy and NSA / Patriot Act access fears and concerns. Amazon’s Frankfurt region has been available since fall 2014 (read more here) and it is no surprise that Amazon shared at the summit that it is the vendor’s fastest growing region. It is obvious by now that you need to be ‘in it to win it’ in regards to the German public cloud market, so the move by Amazon was a well-timed one. But now it needs to drive utilization of the Frankfurt region – so the AWS Summit in Berlin was a key event for the vendor.

    Amazon AWS Sponsors Summit Berlin
    Sponsors of AWS Summit Berlin
    Showcases to the front – As usual in these situations, where vendors need to attract prospects to a new innovative way of doing IT, showcases are key. Their goal is to show that other enterprises (in this case German enterprises) are using the innovation (in this case AWS) successfully. The Berlin location catered uniquely to the local Berlin startup scene; Amazon was aware that the location and timing of the event were not the best to attract corporations, but the turnout was nonetheless good. And Amazon did well at picking only Germany based showcase customers and managed to cover a wide variety of industries and use cases. The marquee showcase was Audi, which shared that it is building its driver / passenger applications “Audi on demand” (more here) with the help of the AWS Cloud. Not the most business critical application for a car manufacturer, but a pretty strategic one, as these applications are expected to work 24x7 and consumers perceive them as an extension of the brand, so they need to be of high quality. Next was German online retailer Zalando, giving good insight into the talent required and the organizational changes necessary to be successful in the public cloud. To a certain point Zalando is to AWS Germany what Netflix is to AWS in the US: a competitor deciding to run on the AWS Cloud, and as such a powerful reference – an aspect that AWS could have made clearer in Berlin (compared to e.g. Netflix on stage at re:Invent in Las Vegas 2013). Next was the startup Tado, the German version of what the US knows as Nest, which uses the AWS Cloud for BigData modeling and Analytics to regulate thermostats in a smart way. And finally the Frankfurt Staedel Museum, as a public sector / non profit reference, shared how it is moving visitor education applications and its overall inventory to the AWS Cloud. 

    amazon aws cto Werner Vogels Summit Berlin orange sneakers
    CTO Werner Vogels (with orange sneakers!)

    Security, Security, did we mention Security? – Next to showcases, vendors need to convince a skeptical audience that they are serious about security. And AWS made that pretty clear, re-iterating many times that ‘security is job #1’ – starting with CTO Vogels and then with a dedicated security track hosted by AWS CISO Schmidt. AWS did a very good job convincing the audience of the advanced nature and sincerity of its efforts. The audience seemed to be unaware of the Key Management capabilities announced back at re:Invent 2014 in Las Vegas, so that was a major takeaway for many security minded attendees. In my close to a dozen ‘before and after’ polls with attendees, it was clear that AWS has been able to dispel the most immediate data security and privacy concerns. Most attendees went from the typical German ‘schaun mer mal’ (let’s have a look) to ‘muessen wir mal anschauen’ (we need to take a look now), which for any connoisseur of the German mindset means a significant change in attitude. Being able to achieve that in one day gives credit to both AWS product capabilities and the presenters, who took a humble, informed and competent approach to the presentations.

    CTO Werner Vogels with the main message - The Cloud is Secure!


    A good event for AWS in one of the most attractive, but at the same time most skeptical, public cloud markets out there. It certainly set the marker that the public cloud has not only arrived physically in Germany with the opening of the AWS region last fall, but is encroaching more and more into the mindset of German IT decision makers. That is traditionally a slow moving process, with all the pros and cons, but once it is moving in the right direction, it will keep going that way for a long time.

    On the concern side, AWS needs to show more direct use cases, and show more of the platform nature of the AWS cloud. German enterprises think in standards and platforms, and are getting more and more eager to standardize on platforms for use cases, the most prominent in Germany being IoT. All German reservations in regards to public cloud security and privacy are literally thrown overboard when it comes to IoT, as it is clear that the sheer magnitude of application requirements can only be addressed in the public cloud. Quite a turnaround in attitude when the use case changes. The challenge for AWS is that German decision makers keep hearing about ‘ready to use’ platforms from the competition, and the traditional toolbox approach of AWS is perceived as a longer learning curve that also bears some assembly risk. That is nothing AWS cannot address in the future, but it requires a slightly different go to market approach than the vendor has shown so far.

    Overall a good event for AWS in Germany. The vendor is clearly in the market for the long run and is doing the basic ground work around public cloud adoption by addressing the fundamental concerns enterprises have with the public cloud. AWS is doing and has done the same in all markets where it operates, it just takes a little longer to convince German IT decision makers, but this event was a key step forward in that effort. The good news for AWS is that its German prospects traditionally value the pioneering work of early innovators: even though they do not buy as quickly as e.g. their US based counterparts, they reward them later, as they ultimately want to be associated with early innovators. So hang in there AWS, the Germans will come, more with IoT than anything else, but once there, they will do it with the famous German Gruendlichkeit (= thoroughness).

    More on AWS
    • News Analysis - AWS learns Hindi - Amazon Web Services announces 2016 India Expansion - read here
    • Event Report - AWS Summit San Francisco - AWS pushes the platform with Analytics and Storage [From the Fences] read here
    • Event Report - AWS re:invent - AWS becomes more about PaaS on inhouse IP - read here
    • AWS gives infrastructure insights - and it is very passionate about it - read here
    • News Analysis - AWS spricht Deutsch - the cloud wars reach Germany - read here
    • Market Move - Infor runs CloudSuite on AWS - Inflection Point or hot air balloon? Read here
    • Event Report - AWS Summit in SFO - AWS keeps doing what has been working in the last 8 years - read here
    • AWS moves the yardstick - Day 2 reinvent takeaways - read here.
    • AWS powers on, into new markets - Day 1 reinvent takeaways - read here.
    • The Cloud is growing up - three signs in the News - read here.
    • Amazon AWS powers on - read here.

    Other cloud related:
    • Musings - Are we witnessing the rise of the enterprise cloud? Read here
    Find more coverage on the Constellation Research website here.


    Every year the Constellation SuperNova Awards recognize eight individuals for their leadership in digital business. Nominate yourself or someone you know by August 7, 2015.


    The SuperNova Awards honor leaders that demonstrate excellence in the application and adoption of new and emerging technologies. 

    In its fifth year, the Constellation SuperNova Awards will recognize eight individuals who demonstrate true leadership in digital business through their application and adoption of new and emerging technologies. We’re searching for leaders and teams who have innovatively applied disruptive technologies to their business models as a means of adapting to the rapidly-changing digital business environment. Special emphasis will be given to projects that seek to redefine how the enterprise uses technology on a large scale.

    We’re searching for the boldest, most transformative technology projects out there. Apply for a SuperNova Award by filling out the application here:

    Here are the SuperNova Award Categories
    • Consumerization of IT & The New C-Suite - The Enterprise embraces consumer tech, and perfects it. 
    • Data to Decisions - Using data to make informed business decisions.
    • Digital Marketing Transformation - Put away that megaphone. Marketing in the digital age requires a new approach.
    • Future of Work - The processes and technologies addressing the rapidly shifting work paradigm. 
    • Matrix Commerce - Commerce responds to changing realities from the supply chain to the storefront. 
    • Next Generation Customer Experience - Customers in the digital age demand seamless service throughout all lifecycle stages and across all channels.
    • Safety and Privacy - Not 'security'. Safety and Privacy is the art and science of protecting information assets, including your most important assets: your people.
    • Technology Optimization & Innovation - Innovative methods to balance innovation and budget requirements.

    And here are 5 reasons to apply for a SuperNova Award:
    • Exposure to the SuperNova Award judges, comprised of the top influencers in enterprise technology
    • Case study highlighting the achievements of the winners written by Constellation analysts
    • Complimentary admission to the SuperNova Award Gala Dinner and Constellation's Connected Enterprise for all finalists (November 4-6, 2015) lodging and travel not included
    • One year unlimited access to Constellation's research library
    • Winners featured on Constellation's blog and weekly newsletter

    And finally - here is last year's winner in the Future of Work category - Robin Jenkins from RMH Franchise - take a look at post and video here.

    Learn more about the SuperNova Awards.

    What to expect when applying for a SuperNova Award. Tips and sample application.
