
    So Thursday was the final day for both user conferences that I have been tracking and blogging about this week. If you missed the first day and middle day updates - you can find them here and here. Usually conferences fizzle towards the 3rd day and the NetSuite conference followed this usual trend. SAP's Sapphires have been a little different in recent years, as Hasso Plattner waits till the end of the conference to present - which creates all kinds of marketing challenges (see below) - but keeps the suspense up till the last day.

    It's back to SAP to start (as yesterday was NetSuite's turn):


    What's SAP news

    Jonathan Becher seems to have throttled the SAP marketing machine to 4 press releases a day - so like Tuesday and Wednesday, we got served 4 new press releases today.

    Needless to say Hana dominated, with the joint press release with H-P as a trusted hardware partner and co-developer of the previously leaked Project Kraken super Hana server. And it's impressive hardware, with 16 E7 CPUs and 12 TB of memory. We will see how successful this machine will be in the more and more crowded Hana space. Next for Hana is the very overdue (in the age of BigData) capability to access Hadoop and other systems, probably all powered by former Sybase code. Given Hana's reconfirmed purist RAM-only storage strategy, such hybrid approaches are the only way for SAP to let customers access the Hadoop world. What this means for query performance will be interesting to see. The new geo-coding was even more overdue and is a key feature for geography-enabled insights. SAP also announced the conclusion of the integration of a slew of former Sybase products with Hana - the interesting ones being Sybase ESP, SQL Anywhere and Sybase Replication Server.

    And remember Business Objects? Well, version 4.1 of Business Objects has been released, and beyond the revamped Lumira offering there is support for Amazon Elastic MapReduce and Hadoop Hive - interestingly, also better access to Oracle's Exadata, OLAP and Essbase. And SAP keeps investing in Crystal Server, targeted at SMBs.

    And no large user conference goes without a reference to the ecosystem, which clearly - like SAP itself - is invigorated by Hana. The partners on board for Hana are pretty much the whole ecosystem - services partners as well as, obviously, the hardware partners. The new twist is that SAP is aggressively targeting re-seller opportunities and the very thriving start-up ecosystem around Hana. Even the ISV partner program - vendors building on top of Hana - is already at 25 partners.

    What's NetSuite news


    Three press releases - all about the ecosystem, naming hero customers and successful partners, as well as 6 partners launching new cloud practices to implement NetSuite products. NetSuite keeps commanding significant interest in the partner space, both on the services and the product partner side - see the coup from Day 1 of signing up pretty much all mid-tier HCM vendors as partners.

    Hasso & Vishal's Delivery

    It was going to be a long keynote and contrary to old habits, Hasso even started on time.


    Plattner started by addressing what SAP customers (should) expect from SAP applications. And he addressed the needs pretty correctly: a system that is adaptive to business challenges, extensible in data and functionality, and gives full access from mobile devices, while being easy to use and offering a consumer-grade experience. And true to form and recent communications, the cloud is now ready for (not simple) business applications, analytics, B2B networks etc. But Plattner also said that larger companies will never ever want to share systems and data center space with other companies... well.

    What surprised me was that Plattner then embarked on an almost 30-minute defense of Hana, seeking to clarify what it really is - SAP's platform of the future. The audience knew that from McDermott and Hagemann-Snabe - just in case it was missed before - but Plattner took the time to defend the little girl. It's always debatable if you need to do this in a format like this; well, he did - and it proved to be insightful, humorous and even generous towards the competition.

    In his typical professorial delivery style Plattner debunked multitenancy first. In an interesting change in story line, classic database-striped multitenancy, as used for applications like SuccessFactors, was still a problem for Hana last week (see here) - now it is not. As Plattner said in the press conference: we will just use more cores. Something that needs a little more digging going forward.
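    For readers less familiar with the term: database-striped multitenancy usually means all tenants share one physical database and every row carries a tenant discriminator. A minimal sketch (schema and names are my illustration, not SAP's or SuccessFactors' actual design):

```python
import sqlite3

# Minimal sketch of database-striped multitenancy: all tenants share
# one physical table, each row carries a tenant_id discriminator, and
# isolation is enforced by filtering every query on it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employees (tenant_id TEXT, name TEXT, salary REAL)")
db.executemany("INSERT INTO employees VALUES (?, ?, ?)", [
    ("acme", "Alice", 90000.0),
    ("acme", "Bob", 80000.0),
    ("globex", "Carol", 95000.0),
])

def query_for_tenant(tenant_id):
    # Every statement is scoped to one tenant's stripe of the shared table.
    return db.execute(
        "SELECT name, salary FROM employees WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchall()

print(query_for_tenant("acme"))   # only acme's rows come back
```

    The point of contention for an in-memory column store is that all tenants' columns compete for the same RAM - which is presumably where "we will just use more cores" comes in.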

    Next was the claim that Hana is not disruptive. The argument is that it's just a change in the database layer. And while SAP deserves credit for trying to make the move of Business Suite users to Hana as easy as possible - it still means a significant challenge... if switching databases were not disruptive, then SAP and Oracle's competitors would have been much more successful with the many campaigns trying to dislodge the Oracle database under the SAP applications.

    Then we were at the point that Hana cannot be successful, as it supports only RAM as storage medium. Plattner argued that only the data that is used and needed will move to Hana (and into RAM); unused data won't. And then there is compression on top. And while you certainly won't need all 500+ fields of an order line - SAP needs to explain how to identify and migrate the used columns. And how to add further ones when usage changes.
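    One plausible way to identify the used columns - purely my speculation, SAP has not described its method - is to mine the SQL workload for column references and treat never-referenced columns as candidates to stay out of RAM:

```python
import re
from collections import Counter

# Hypothetical sketch: count column references in a query log to find
# which of an order line's many fields are actually used. The column
# list and the regex-based extraction are deliberately naive.
query_log = [
    "SELECT item_no, quantity FROM order_lines WHERE plant = 'X1'",
    "SELECT item_no, net_price FROM order_lines",
    "SELECT quantity FROM order_lines WHERE item_no > 10",
]

known_columns = ["item_no", "quantity", "net_price", "plant", "obsolete_flag"]
usage = Counter()
for q in query_log:
    for col in re.findall(r"\b(" + "|".join(known_columns) + r")\b", q):
        usage[col] += 1

# Columns with zero hits (here: obsolete_flag) would never be loaded.
print(usage.most_common())
```

    The harder part, as noted above, is re-running this analysis as usage changes over time.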

    It was also clear that Plattner sees Hana as the chance to slim down the massive R/3 DNA in the SAP applications. Forget about columns, we don't need all this functionality, we don't need long-running queries... up to the claim that there weren't long-running batch jobs when he was CEO... and while a fitness program for bloated SAP business functionality while transitioning to Hana is highly desirable - the challenge of course remains: what is the relevant functionality? SAP needs to come up with a visible process on what is fat and what is muscle it wants to keep.

    And lastly Plattner shared that he is deeply hurt that Hana is perceived as for SAP applications only, stating that over 60% of the use cases are for non-SAP apps. Would love to see the use cases. And it's great that Hana is open - Plattner even invited Oracle to run on Hana (!) - but SAP needs to fight the perception created by its own track record. SAP in-house developed technology (like Hana) has never been sold, positioned or stood the test of the market as SAP-independent technology. Hard to fight perception; harder if your track record supports the perception.

    And Plattner busted the myth of the Sybase acquisition being all about mobile - it really was about the database expertise and related IP, patented and not. And many have said this before... including Plattner's friend Ellison, who understandably sees Sybase as an inferior competitor in the database market. And Plattner also shared that Microsoft wasn't too happy with the Hana strategy - but they were professionals about it.

    The start-up interest, with 431 start-ups using Hana, is no doubt a tremendous success. Just from a sheer mass-onboarding perspective a great feat, no question. But later we learnt that the Hana venture fund has gone up from 150M to 450M US$ - that certainly has helped. And it is, to my knowledge, an unparalleled greasing of an ecosystem in 12 months.

    The collage of startup logos working with Hana

    There were two demos in the keynote - one demoed by the very talented students from the Hasso Plattner Institute in Potsdam, Germany - the other by the ubiquitous Sam Yen. Both impressive demos, with the 2nd one having a graspable, direct business benefit for a CPG company.

    Next up was Sikka - and true to the other board member keynotes, it was his time to use metaphors around Hana... What the need for speed in sports is for McDermott, and Darwinian enterprise survival is for Hagemann-Snabe, Mathematics and Design are for Sikka. And with the mathematical beauty in flowers we were possibly at the key announcement of the event - in terms of affecting SAP users immediately.

    So Fiori wasn't too much of a surprise anymore - given the GA of SAPUI5 over the weekend and the press release about Fiori on Wednesday - I was just wondering if it would even be shown in a keynote at all. So SAP is trying to tackle for the nth time the casual user consumption challenge, in the latest flowerful iteration with the help of HTML5 and Google Chrome. Turns out the self-service UI dubbed the lily pads, which we saw in January at HR2013, was already an early release of SAPUI5. And with 25 applications Fiori covers more than HCM self-service scenarios and goes beyond to general approvals and of course the area where UI counts the most - sales. But HTML5 is fickle and the browser vendors are fighting over the standards - so you best work on one browser, which SAP has done with Chrome. And following Google I/O in parallel - that was a good choice. Whether betting on Chrome only will be an issue with customers - we will see.
    Some things at SAP never change - the licensing was not clear at release time; you need SAPUI5, Gateway and Business Suite extension licenses. SAP could have made a huge splash by saying: if you pay maintenance - there is no additional license cost for you, sorry it took us so long to get usability right.

    No Hana keynote without hardware and hardware partners and H-P's Bill Veghte made a passionate statement about the new hardware coming from Project Kraken. If you thought Plattner and Sikka were excited, Veghte was on a different level, good for H-P. I want to know what he had for breakfast.

    Sikka also tied in the Lumira announcement - good to see keynote presence for the former Business Objects tools, which seem to slip into the background. In contrast to Fiori, the pricing here was clarified - and is mostly free for a try & buy period - kudos for a cloud-age sales strategy.

    And SAP also unveiled a partnership with Adobe, which will bring the Adobe Marketing Cloud to Hana - no specific dates, but a good partnership as it helps SAP out in the weaker aspects of online marketing, an area where Oracle and salesforce have invested heavily for a few years now.



    And being day 3 - there was also time to work the ecosystem, with the winners of the startup challenge being announced, all running very close to (SAP) home with their automation content - two had HCM scenarios, one was in the finance area. Obviously, as a startup working on Hana, it makes sense to see exit options in regard to SAP's core business.

    Lastly we were off to the healthcare aspect and while I like the philanthropic aspect - I fail to buy into the tangible business benefit. One demo was about analyzing tumors in real time - but luckily tumors don't change in real time. Simulation would be a scenario I get. Let me know what I am missing here.

    The McGeever delivery

    Well, it is by now almost tradition for SuiteWorld that the keynote webcast is blacked out at the beginning. McGeever started with a James Bond skit, and presented in a tuxedo. It was a solid day 3 keynote addressing services, upgrades and partners. Upgrading while you sleep is a nice marketing catch phrase - but does not really fly for worldwide customers.

    It was good to see how much interest and leverage NetSuite has with partners - and it was a solid day 3 keynote...  but nothing more.

    The marketing battle

    This day clearly went to SAP. And let's be clear - for NetSuite even to compete is a success. As a technologist it's amazing how a 600 member development team at NetSuite can even pose headaches to a company with 20x+ the developers. But size alone does not make you fast - though, as this week's Google I/O shows, Google is not slowing down, as Box CEO Aaron Levie tweeted this week.

    We didn't get an answer why SAP had the Hana Enterprise Cloud event last week, one week before Sapphire. And leaked the Lumira release. And the SAPUI5 release. And the keynote-absent NetWeaver 7.4 GA (something that will affect many more attendees short term than Hana). It all came together with Sikka's part of the Day 3 keynote. But it raised a lot of questions early that were answered late - then again, SAP can command an attention span of over a week - so this seemed to work. I would love to chat about this with Jonathan Becher - is this the new PR strategy for events ahead? Not textbook - but I wonder how it compares in marketing metrics of quotes, mentions, hits etc.

    MyPOV

    A conventional Day 3 for NetSuite, one that as a more product-centric observer you could skip - as a customer certainly not.

    For SAP it all came together. One week even changed and clarified some things around Hana - proof of how fast the company is moving its thinking and messaging around this product. The keynote strategy was certainly best for last - as only Sikka's keynote presented and closed the loop on the many earlier announcements of the week and the previous week.

    Lots to digest for both companies, a lot of details to be hashed out in the next weeks. The towering top takeaway by vendor: NetSuite has an aging UI and needs to do more about it; SAP is a technology company right now - we will see if this is a phase due to re-inventing itself on Hana - or a longer lasting change in the company DNA.


    A massive Sapphire conference is over, with a lot of excitement, energy and announcements. If you want to re-live any of the days, have a look at my posts about Day 1, Day 2 and Day 3 - or for a quick feel - check out the Storifies collected from the marquee tweets over here.





    In case you missed it, I posted a good week before Sapphire about the broad collection of topics I wished SAP would answer at this Sapphire conference... 

    So I am following the structure of that post here.


    The Future - Nada

    This was largely asking for the direction of the company in regard to the ambition of serving one billion users by 2016. And while SAP proudly stated the fact that it now has close to 30 million users in the cloud (detailed math would be welcome) - it wasn't clear to me how the company wants to reach the one billion. The immediate future is certainly Hana - but I don't think you can ask the little girl to power solutions that will reach every 7th person on planet earth. So what is the plan?

    Why does it matter?
    In the past SAP allocated significant resources to previous company goals - 100,000 customers (achieved with the acquisition of Business Objects) and 100 million users (achieved with the acquisitions of Sybase and Ariba). One billion as a number will take more than acquisitions alone; it will take significant R&D investment - which will affect current product plans, customers and the ecosystem.


    The Integration Story - The Where is clear, not the How

    This question targets how all the different SAP offerings will come together - from the older homegrown products, to the acquired ones and the newly created ones. And SAP was very clear - though implicitly - that all will come together on the Hana Cloud Platform. The older homegrown products are being ported / made available there (see the GA of the Business Suite), the acquired products will move there (see SuccessFactors, to be expected in August this year) or are being built there (see all the new products). From the keynote it was also clear that Plattner sees this as the historic opportunity to shed product complexity, both in schema and functionality, going forward. That will be an interesting process and SAP will have to be very careful and very clear about what is fat and what is muscle, what can go and what will stay.

    It is also not clear how you can integrate on Hana - on the schema level, using SQL? Hardly the integration technologies of the 21st century. So kudos to SAP for making the where clear - Hana - but a lot of questions remain on how it will occur for all the products involved.

    Why does it matter? 
    SAP is the #1 enterprise applications vendor with a tremendous install base. Any future strategy will have to find paths for that install base to move along - at some point. In the past SAP has failed at addressing this (see the original post here) and it's not clear how the Hana Cloud Platform will get there either.


    Cloud - We have SAP's answer 

    SAP certainly answered this one. It's the Hana Enterprise Cloud, as announced a week before Sapphire. And it's the SAP view and definition of the cloud: one that is, by its RAM nature, not elastic (or one-way elastic, as we learnt from Plattner in his keynote - more in a future post), has its own views on multitenancy (only for small companies), does not need virtualization (again relevant only for small companies) and requires you to bring your own license.

    On the positive side it was good to see Plattner and Sikka argue with cloud benefits - mainly in regard to the direct ramp-up of a solution, the speed gained from having SAP procure the hardware, the administration, the managing of data centers and software updates - already at the Hana Enterprise Cloud launch.

    But to be fair, SAP has the right and the weight to define its view of the cloud and impose it onto its customers and its ecosystem. Many may not like it, not agree and warn, but SAP has the right to choose its path to the cloud. Many follow-up questions remain.

    As a side thought: Did SAP maybe pull a marketing stunt? Obviously SAP had to do a lot of proof-of-concept work to show Hana prospects that Hana really works. And given the hardware price tag, potential customers surely asked SAP to run those projects. And SAP logically wanted to be cost effective with larger machines. Could this be the start or even base of the Hana Enterprise Cloud? It's plausible - the question is: how much cloud functionality did SAP add to Hana Enterprise. We will know and see soon if this was the culmination of a long-planned development effort - or a very quick marketing dress-up of hardware and services that were already used for the necessary proofs of concept...

    Why does it matter? 
    The dynamic provisioning of computing resources on a large scale, with mission-critical service levels and availability, is the upcoming standard for enterprise computing. As SAP states, even in the SAP cloud flavor, significant TCO reductions have manifested themselves. The capability to manage TCO on a very fluid and dynamic system and usage landscape will determine the winners of enterprise applications in the cloud age - and SAP's strategy to move to the cloud will decide the future success of the company in the SaaS space.


    Hana - Some Yes, Some Not

    While many participants and observers wondered if there was a single session at Sapphire that did not mention or use the ubiquitous Hana - it was clear that SAP trusts Hana to be good enough for the GA of the Business Suite. While news reports out there already speak about a rushed GA - it's really SAP's business if Hana is ready for GA - or not. It's the analysts', press's and bloggers' job now to find out if this is a real GA in the terms SAP has used for past GAs - or if it's a marketing, faux GA that still limits availability and functionality. What concerns me is that there are only 4 customers that are live and speak to the public, one of them being SAP. I remember SAP asking for validation by around 100 or so customers before putting a product in GA. But again, this is SAP's business.

    At this point I cannot discern which parts of the core and vertical functionality are GA - or even in the process of being ported. I trust SAP that the majority is - but the company should be very clear and should have addressed it in the many keynotes and press conferences (5 total). There was also no high-level explanation of how SAP pulled off this major engineering feat. Again, I question why this was not addressed openly. And of course the question on the good old SD benchmark remains. SAP missed the chance to explain the void here and why e.g. other benchmarks are better for the Hana age.

    Why does it matter...
    ... well you know, otherwise start reading about Hana...


    Mobile - Some Yes, Most Not

    While mobile was prominently featured, SAP did not make the clear statements desired in this area. If you can move the Business Suite to Hana - why not move the screens to e.g. tablets? It may not make sense - as with the Business Suite - but when you labor with orders of magnitude on the one side - why are you that far off on mobile?

    If my memory does not fool me - both the McDermott and Hagemann-Snabe demos were void of anything mobile. Plattner said mobile first at one point - but it was part of the lecture, not the product part of his keynote. And yes, Fiori runs on phones and tablets - but what about the 100+ apps shown 2 years ago? How does SAP plan to mobilize content? It's ok to say: we need to sort out some technology first and build then - the customer base will cherish that as part of the good old SAP, the one that delivered like clockwork in the past, void of any marketing gimmicks.

    The danger is that mobile will be seen more and more as an MDM and security play. And while it's understood that Sybase has a franchise here with Afaria and SAP has capabilities, as mentioned in my Day 1 post - the partnership with Mocana shows that SAP underinvested in Sybase / Afaria R&D. In-built app security should not have to be a partner-opportunity afterthought for an enterprise application vendor... and while I think the choice to run the MDM cloud on Amazon's AWS is the right one - it begs the question why the Hana Enterprise Cloud was not used. But this new capability was absent from the keynotes. An opportunity wasted, since everyone in the audience had a tablet and smartphone, and most of them access critical enterprise information out of SAP systems from them. The importance to SAP is clear - the investment is prioritized somewhere else... and last but not least - affordability remains a topic. Kudos for getting it right for the MDM cloud.

    Why does it matter... 
    ... did you see Google going from 400M to 900M Android activations in less than a year? There you have an enterprise reaching close to 1 billion... with a single product.


    Social - Nada

    And while McDermott talked about social being the new dial tone - not only Dennis Howlett (@dahowlett) didn't fully grasp that - it was notably absent from the keynotes. Yes, SAP Jam was important to get to the close-to-30M cloud users - but that was the most-mentioned social product fact.

    But maybe SAP is quietly cooking something up - from this Sapphire, though, social does not seem to be a first class citizen at SAP. It's fine for SAP to say that for now investment goes somewhere else - as mentioned above with mobile - but then there should be a road map for when social will catch up. Better, in my view: make it an integral part of the Hana platform going forward, as seen with the Adobe Hana partnership. If there were more to it - SAP and Adobe could have said it.

    But again, maybe SAP saved some announcements for later here... Right now - unfortunately - most users of a SAP product may pick up the (social) phone - but will find no dial tone...

    Why does it matter... 
    The trend is there, the competition in California (Oracle, salesforce) is investing heavily - SAP should not be left standing in the rain when the social downpour opens its floodgates.


    Big Data - Coexistence

    SAP made it clear that Hana will be able to access Hadoop data, and with that took the position of the old data warehouse vendors - the position of co-existence. As such SAP risks (and is in good company with Teradata and SAS) missing out on the Hadoop revolution that we are witnessing.

    But unlike the established data warehouse vendors, SAP does not have to worry about a legacy licensing business; it is re-inventing and re-creating all critical new code right now. SAP's timid approach is determined by the storage medium for Hana - very expensive RAM.

    And as we saw, Plattner was even compelled to show how SAP can bring OLTP information into RAM at lower cost, thanks to selective column usage and compression. But the SAP execs would not dare to start the argument in regard to the unbelievable amounts of data becoming accessible through Hadoop technology.

    Again, SAP may have different investment priorities, which I can even understand and support, but staying in the co-existence camp definitively has huge risks. Not investing in social and in combining OLTP and Hadoop information makes SAP vulnerable to better and cheaper business insights provided by the new players that bet fully on the Hadoop card.

    Why does it matter... 
    We live in a breakthrough age of computing. For the first time ever, enterprises can store all the information they want and need in an affordable way, analyze it, and even ask questions they did not know at the time the information was stored.


    Line of Business - Most answered

    With the GA of the Business Suite on Hana - the direction is clear. The scope available on Hana needs some clarification though. And with the press conference statements that Ariba (no date) and SuccessFactors (August 2013) will run on the Hana Enterprise Cloud, the integration problems I pointed out before Sapphire are addressed - for the short term.

    SAP needs to understand that while moving the acquired products to the new Hana platform is a strategic step - it still does not integrate them. Which brings us back to the question of the integration story - mentioned above.

    SAP used to ridicule Oracle by saying that the DBMS vendor thinks all is integrated once it is in the same schema. Well, SAP should not think all is integrated when all the data is in column stores either. It's not clear today what the integration mechanism for e.g. SuccessFactors with the rest of the Business Suite will be - especially if you consider cloud integration questions. In that sense - as great and bold as it is - the availability of SuccessFactors on Hana, for instance, will increase the number of integration scenarios that SAP will have to support.

    Why does it matter... 
    This is where the bread and butter for SAP and its customers is made. It needs to be addressed to make the most of the Ariba and SuccessFactors acquisitions and to solidify the new joint functionality realm against any competitive poachers from the outside.


    The maintenance saga - Nada

    The keynotes were void of this sore topic. And while I understand the reluctance to address the issue in a keynote - and Hagemann-Snabe even stated that SAP sees no problem here - SAP missed the opportunity to openly speak to customers.

    Worse - the maintenance topic had more oil poured on the fire with the (great) new Fiori applications. Not only was SAP, as is tradition, not clear on the pricing (a troika of Fiori, Gateway and Business Suite Extension licenses) - it also disappoints maintenance-paying customers that the latest - and hopefully successful - attempt to get the eyeballs of the casual user will not be part of the maintenance benefits, but an extra charge.

    Why does it matter ... 
    Well speak to any SAP customer if you don't know, but don't be surprised to be yelled at.


    Vision & Thought Leadership - Nothing new

    There wasn't much, if anything at all, in regard to how a 21st century company is run. Plattner did a good job at summarizing what SAP customers want today - but the question of what they need tomorrow, or more specifically in 2020, remained open.

    SAP needs to stop relying on Plattner for vision. Let's not forget that Plattner was (and is) the technologist among the original 5 founders. It can't be up to him to create the 21st century vision for business applications. It does not come from McDermott, who is too consumed by short term sales priorities and the marketing message du jour. And it will not come from Sikka - too much of a technologist himself, like Plattner. So it falls broadly on Hagemann-Snabe's shoulders - though I am not sure if he will be the one to lead SAP on the topic, having been in charge of the Business Suite and all of development for too long not to take a stab at it.

    But never say never. If you think through the Darwin talk for business automation processes - what does that mean? It looks to me like SAP is trying to bootstrap the problem with the Hana start-ups. If I caught it right that the Hana venture fund is now at 450M - then this would be the broadest stimulation of an ecosystem ever.

    But it is also a confirmation that the SAP leadership thinks the next generation thinking for enterprise applications will not come from SAP - but from small start-ups. Only, this has not happened for a very long time; SAP itself is the living proof of that. The last company to challenge the establishment, Siebel, ran out of thought leadership runway before Oracle acquired it. That makes me skeptical of start-ups as the outlet for next generation thinking. And it's definitely sad that SAP has delegated the challenge of its future and not taken it head on.


    MyPOV

    A very interesting Sapphire, with much more left to write about. Some questions raised before the conference here got addressed, many not. What is clear is that SAP is re-inventing itself around Hana. And the re-invention is in full progress, though in an early stage - maybe stage 1 of 4 or 5 total stages.

    SAP may not like to hear it - but it behaves like, and is, a technology company right now. Which, like it or not, SAP has to be, as it lays the foundation for its future in enterprise automation. As with any process in early stages, a lot of questions need to be answered, remain open, and new ones will pop up soon.

    It will be good for SAP to address them, the sooner the better. It's up to prospects, customers, partners, competitors, analysts, press and bloggers to keep probing.



    Note: This is a post from the fences, which means I have not attended Sapphire and wasn't briefed in person by anybody from SAP - but used the webcast, press releases and social media to make up my mind here. Feel free to point out what's wrong or missing in the comments section!


    During last week's Sapphire conference, Hasso Plattner took the opportunity to address a number of myths about HANA - and also clarified to the audience how data gets loaded into HANA.




    How does data get into HANA

    Well, first data gets transferred from the source systems into a column store in the HANA format (lacking a name from SAP right now, I will call this the HANA store) - which gets roughly 5-10x compression - and the storage medium is whatever the storage vendors give us (Plattner) - disk or SSD.
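    For a feel of where such compression ratios can come from in a column store, here is an illustrative sketch of dictionary encoding - my example, not HANA's actual algorithm, and the numbers are invented to land in the ballpark Plattner cites:

```python
# Illustrative sketch: a column with few distinct values stores each
# value once in a dictionary, plus a small integer code per row that
# can be bit-packed. Not HANA's actual algorithm; figures are invented.
column = ["DE", "US", "DE", "FR", "US", "DE"] * 100_000  # a country column

dictionary = sorted(set(column))                # ["DE", "FR", "US"]
code_of = {v: i for i, v in enumerate(dictionary)}
encoded = [code_of[v] for v in column]          # one small code per row

raw_bytes = sum(len(v.encode()) for v in column)          # ~2 bytes per row
# 3 distinct values need only 2 bits per row once bit-packed:
packed_bytes = len(encoded) * 2 // 8 + sum(len(v.encode()) for v in dictionary)
print(f"compression ~{raw_bytes / packed_bytes:.0f}x")    # ~8x here
```

    Repetitive enterprise columns (countries, statuses, units) compress this well; high-cardinality columns compress far less, which is one reason overall ratios land in a 5-10x range rather than higher.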


    Hasso Plattner during the Sapphire 2013 keynote

    Next, the most frequently used tables get loaded into memory - Plattner said like the other ones [he meant vendors] - it's not clear what determines the frequency at the first load into HANA - but that's no rocket science to get done.

    Slide from keynote

    Once work in HANA starts, the system dynamically loads (and that was news to me) the needed columns into memory (again from disk or SSD permanent storage). So far I had thought - and misunderstood - that everything the application running on top of HANA needed had to be loaded into RAM. This begs a whole set of performance questions.

    So only the columns that are needed are in RAM as Plattner emphatically stated.


    Slide from keynote

    A column can only be completely in memory - or not at all - there is no caching or intermediate state for columns. As Plattner said, the algorithm is pretty primitive - if you are not used - you are not in memory.

     In the below slide - the red ones [columns] never made it - said Plattner:


    Slide from keynote

    Plattner made the point that this is the difference to a row store - and with SAP's order line having over 500 fields, he claimed that you cannot achieve a similar compression with a row store. And fields that are not used are not even in the HANA format store on disk / HDD.
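    The all-or-nothing, load-on-first-use behavior described above can be sketched in a few lines of Python. This is a toy model with invented names (ToyColumnStore, read), not SAP code - it only illustrates the mechanics Plattner described: a column is either fully in RAM or only in the persistent HANA store, and unused columns never consume memory.

```python
# Toy model of HANA-style column loading - illustrative only, not SAP code.
class ToyColumnStore:
    def __init__(self, on_disk):
        self.on_disk = on_disk   # persistent "HANA store" (disk / SSD)
        self.in_memory = {}      # columns currently resident in RAM
        self.loads = 0           # number of full-column loads performed

    def read(self, column):
        # A column is either fully in RAM or not at all - no partial caching.
        if column not in self.in_memory:
            self.in_memory[column] = self.on_disk[column]  # load whole column
            self.loads += 1
        return self.in_memory[column]

store = ToyColumnStore({"customer": ["ACME", "Initech"],
                        "amount":   [100, 250],
                        "status":   ["open", "paid"]})
store.read("amount")  # 1st query: full column loaded from the persistent store
store.read("amount")  # 2nd query: served entirely from memory, no new load
print(store.loads, sorted(store.in_memory))  # → 1 ['amount']
```

    Note that "customer" and "status" were never queried, so - like the red columns in the slide - they never made it into RAM.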


    Questions remaining open

    The slides above state that columns stay in memory till the system is re-started or the columns are purged. Plattner did not elaborate on the purging mechanism, which would free up memory, that would be available for loading further data as needed, from the HANA store. 

    Plattner was also very clear that the state of a HANA column is binary - either it's in memory or it's in the HANA store. But then the slide above states that during the 2nd request [of the same data - my assumption and addition] - the data is near 100% in memory. These two statements contradict each other - unless the 2nd query is chasing the very first one that's loading data from the HANA store to memory - but that would be a border situation. 

    Both questions are important - because they would point to some sort of elasticity of the RAM usage by HANA.


    Why elasticity matters

    Well, for starters it's one of the key defining characteristics of the cloud - based on NIST. It really matters as it allows the scalable provisioning of computing power - both for ramp up and ramp down. And with that it determines the TCO of a cloud infrastructure. An inelastic cloud system - and as such we have to regard HANA at this point - will be very expensive to operate. Add to that, that HANA by design needs to run in the most expensive storage medium out there, RAM - and TCO rises even further. 

    Normally the opposite should be the case - the more expensive the resource, the more important its efficient utilization is, which in a cloud infrastructure is driven by elasticity. 


    The AWS argument

    Plattner at an early point in the keynote said that those who believe HANA cannot be elastic should look at HANA One, which runs on AWS and is elastic there. True, but only due to the AWS infrastructure that manages the HANA One AMI. There is nothing in the SAP code that makes HANA One elastic. And hence the concerns remain - the mention of HANA One does not defuse the question of how elastic HANA itself is.


    MyPOV

    With Plattner slowly lifting the kimono a little more - it's clear that HANA is (benevolently put) somewhat elastic - since it will only load into RAM what really matters - only the used and needed columns. But the purge mechanism isn't clear. And even if its inner workings get clarified, it's really more a gigantic cache we are talking about - not an elastic cloud product. 

    But that's what I know so far. SAP is a company with many smart engineers - I am sure they know this and can address it, making HANA much more elastic than what it is and / or looks like - today.

    P.S. Since I picked up on @JBecher's official guideline that HANA is all caps - it's all caps from this post going forward - forgive the earlier spellings like Hana. 



    This morning saw the confirmation of the departure of Lars Dalgaard from SAP. There have been rumors and explanations about his absence (death in family, other family health issues) - now it's official. This re-org of SAP's board is easily the largest change since Léo Apotheker's departure a few years ago.



    Why now

    Well, we will never really know till some time down the road - but there are many speculative options... here are the most likely ones from what I see today:
    • SAP's cloud pace - though they claim it's fast - can still accelerate
    • It's not smart to put the enterprise's future into only one exec's hands
    • Dalgaard wasn't the right man for the job
    • [Speculation!!!] Maybe there wasn't enough enthusiasm for the HANA cloud strategy?
    Let's assume going forward for now it's all about putting SAP's future on more shoulders. 

    What's changing

    SAP is handing the reins of development over to Sikka, who never formally was in charge of development (see here - per website today - and screenshot taken today) - but has been running the technology foundation since the departure of former Business Objects technical leader Herve Couturier.

    In its typical yin vs yang fashion SAP also appointed Bernd Leukert to the Global Managing Board - the operative body under the executive board. Leukert has been in charge of all on premise and vertical products as well as BusinessOne, and will be the 2nd highest product development executive going forward. This will also reduce the concerns of the still massive Walldorf based developer community about being run from abroad - one of the key fears in Walldorf, as manifested in the Agassi era - as well as those of SAP's large on premise customer base. 

    The sales responsibility will go to the ex Ariba CEO, Bob Calderoni, who is hopefully a better sales professional than keynote presenter - granted, it was tough to follow the sports show of Bill McDermott at Sapphire. But it also shows that at the time of Sapphire the re-org wasn't known or planned - as SAP would have given their new cloud sales leader a better and more prominent stage.

    The cloud operations part goes to the longest tenured board member, Gerhard Oswald - who always ends up getting some pieces when major re-orgs happen at SAP. The former SuccessFactors team can start rehearsing for his famous 5 minute meetings. And the consulting side joins with Rob Enslin - a logical functional consolidation.

    The potential loser is Hagemann-Snabe - who formally is responsible for product development - e.g. Leukert reported to him - and is now more removed from day to day operations, which are clearly with Sikka now. This may start some speculation about the co-CEO structure going forward. 

    On a side note - the just hired head of HR - Luisa Delgado - is leaving at the end of the quarter, after Angelika Dammann the 2nd woman board member to stay for a shorter time than anyone would hope. As usual Werner Brandt takes over the HR and Work Director functions. Nicely buried in all the other changes - SAP will need to get its diversity story right at the executive level soon. 

    Going forward

    The experiment of concentrating all cloud DNA in one place has failed. Both on a general company level, and previously on a development level (remember John Wookey and Peter Lorenz). Now it looks like cloud is too big a topic for a single individual at SAP - and the cloud related organizational functions are now separated along broader functional lines. And while it's a sign that SAP now puts a team focus on its cloud efforts - it will also mean significant ripples in the functional teams, as the new cloud teams join the old functional ones. Close attention will have to be paid internally and externally to how much cloud talent will stay and keep getting SAP paychecks. 

    Last week I speculated who would lead SAP to the 21st century and next generation applications. Now we know the responsibility falls on Sikka, and to some extent, for the business application legacy, on Leukert.

    MyPOV

    Either the lack of the right executive or the realization that cloud can only become part of the SAP DNA if it permeates all functions has led to a functional organization of the cloud responsibilities across the SAP management. This is a good move, allowing Sikka to be fully in charge of the product future of SAP. It increases the risk for SAP though, as the system of checks and balances put in place with Sikka's hire as CTO in Palo Alto is now history.

    Let's hope that enough cloud DNA will stay and develop in sales, consulting and operations for the rest of the company to make the turn to a cloud business. And as previously observed - HANA scales and SAP manages to invent the game changing 21st century business applications on the HANA platform.

    And other good takes on this from

    • @dahowlett - All change at SAP - here
    • @ckanarauskas - SAP shakes up development organization - here
    • @twailgum - SAP announces sweeping organizational changes - here


    One of the presentations I look forward to most every year is KPCB's Mary Meeker's 'State of the Internet'. These days she releases it as the 1st speaker of the AllThingsD conference - and thankfully the slides are available on Slideshare immediately afterwards. Somehow Mary always manages to have some interesting trends up her sleeve that at least I have not seen at all, or not seen with that relevance... so why not check out what this year's presentation held for the enterprise?



    The Internet is the platform

    Remember the times when professionals were saying that emerging countries to a certain point, and the third world for sure, would have internet access problems? Well, Google will take care of some of that now - but already today internet coverage and usage are such that a global enterprise app needs not only to be browser based, but to understand the nuances of internet access in key markets. And there lies the challenge - though the hope rests on the shoulders of HTML5, different bandwidth considerations have to be taken care of. 




    BigData getting bigger

    A lot has been written on the data explosion, but it's clear that any enterprise vendor needs to have a bigdata strategy. Too much relevant information is available now for it to be ignored in enterprise processes. And while the externally facing processes have been in the driver seat in the past, bigdata is relevant for internal processes, too. Think of the explosion of picture, voice and video data - this will not stop at the enterprise gates; enterprise apps will have to enable relevant usage of media for business processes, and the resulting data needs to be stored, accessed and analyzed later.





    The sharing economy

    One of the emerging trends is the sharing economy - and it manifests itself in consumer apps like snapchat.com - but when consumers start to share resources, it will not be long until this trend becomes a differentiator for the back-end processes of enterprises. At the end of the day it's the apps at these enterprises that enable the economy, so if sharing takes off as a business trend, it will raise all kinds of issues for a more rigid, purchase-only minded business automation app. 





    Mobile First is not a choice but a Must

    No surprise that mobile internet access is booming, but it is interesting to see inflection points where mobile internet usage eclipses all other usage in countries like China and South Korea. And with tablet shipments surpassing desktop and notebook PC shipments, it's key to design for flexible form factors. 




    The unfortunate state of enterprise applications at this point is, that there has not been an enterprise application architected and designed from the ground up for all processes to be mobile first. There will be first-mover gains in this area - if a vendor gets this right.


    The xx-able revolution

    With the miniaturization of devices progressing, it's clear that smart devices will change form and become wearable, driveable, flyable, scanable etc. It's early days, and enterprises are still figuring out how to e.g. track high value equipment via RFID, but the whole social and collaborative aspect of business work will change with wearable devices. Why even go to a video conference room, or Skype, if your communicator device can be with you all the time? Why spend money on expensive printing, if a simple QR code can suffice? And QR codes solve a problem that isn't solved on most wearable and smart devices: information retrieval. The QR code brings the user straight to the wanted information - and avoids the distractions served up by a search engine or other means of information retrieval. 





    Look at China

    There have been a lot of concerns around a market that is tough to enter - copy cat competitors, IP issues etc. have been raised in the past - but Meeker's presentation makes clear that China is a modern market, and innovation really happens there. If China weren't such a huge market - we would see much more of that innovation in the US. 
    Just think of the Chinese innovations presented coming from a small market like e.g. Belgium - they would all be exported and available in the US already. But given some degree of North American xenophobia towards China and the attractiveness of the home market for Chinese startups - we haven't seen much of them here yet. For instance I knew Alibaba, but that Alibaba merchandise shipments have passed the combined Amazon and eBay volume - was a surprise. Same day delivery, taxi apps, the social site Weibo - these are all key trends that any enterprise vendor has to look into. And China will soon pass Europe in its share of the world's GDP - an impressive chart:




    Enterprise Freemium Model?

    We have seen a lot of freemium apps in the B2C space - but not in the enterprise apps space. Somehow the thinking is - as you pay for the license - you don't get any advertisement / promotion coming your way. But seeing Meeker's slides on the market - why not offer a mild form of advertisement in an enterprise app? Especially for the occasional user of self service apps - the advertisement may not even be noticed - but may enable free usage of the enterprise app. It will take some B2C entrepreneurs to experiment here - but I think there is an opportunity.





    Other notables

    High tech stands on immigrants' shoulders: 60% of the Top 25 tech companies were founded by 1st and 2nd generation Americans - and immigration is needed to close the talent gap, as e.g. IBM, Intel, Microsoft, Oracle and Qualcomm have over 10000 openings in the US. With a lot of innovation still coming from the US - e.g. Meeker pointed out that 80+% of top internet properties come from the US, but 80+% of traffic comes from outside the US - it remains key for enterprise software vendors to recruit talent from across the world. It's cheaper to have two Brazilians working for you in e.g. Sunnyvale and get the Brazilian version done right, than to open an operation in Brazil. 






    Gold in the Appendix

    Some more interesting findings from an enterprise perspective were in the appendix. E.g. the IBM survey sees technology factors rising to the #2 concern for CEOs; they were #3 in 2008 and 2006, and #4 in 2004. This is a huge opportunity for enterprise vendors to address and leverage for their future offerings - but they need to keep the reduction of risk and the focus on a positive business impact in mind. 




    Re-imagination is alive and well according to Meeker, but note that most examples come from the B2C area - with the exception of MakerBot's 3D printing and eLance and oDesk flexible employment (did you know WaaS is Workforce as a Service?). The question is - where is the re-imagination of enterprise apps? 





    What I missed

    I can only imagine what compromises Meeker must have made to fit her 90 slide presentation into 60 minutes - but here are a few trends I would have liked to see make the deck:

    • Analytics
      We are seeing more software that predicts what we want to do and see than ever before. If you haven't played e.g. with Google Now - fairly simple analytics apps - take a look.
       
    • Social meets Business
      It may not have been the scope of Meeker's presentation, but the whole CxO discussion and how enterprises have to figure out their social presence and actions was missing.
       
    • Cloud
      And while you can say it's remarkable that it wasn't mentioned because cloud is a given - Meeker only touched the edges, e.g. that it's easier and cheaper today than ever to get 1 million users on your product. But the changes and implications the cloud's elasticity brings to markets would have been something I expected. 



    MyPOV

    As an enterprise vendor today you need to build for the larger, more global internet, going mobile (and tablet) first, source your talent globally, have a big data architecture and think through your China strategy. Make sure your user interfaces are highly transportable and independent of form factors, as we do not know where the xx-able revolution will lead us for the consumption of enterprise services. 

    And look at the consumer space, where newer trends like project based work, sharing resources etc will soon have an impact on the future enterprise apps. 





    My recent post on how payroll matters again both for vendors and practitioners got a lot of uptake and even discussion - which was much more than I expected on this supposedly boring subject of enterprise automation.  




    The Disconnect of HR 

    Many posts and articles have been written on how the HR function is somehow disconnected from the rest of the business. A perennial leitmotiv has been the question of how HR affects the business in a positive way - versus just trying to make sure that managers or the company do not get sued. So it all comes back to creating more value out of the HR function for the rest of the enterprise.


    Value & Connect

    So what is a good vehicle for the HR function to connect with the rest of the enterprise and demonstrate value? The classic path has been to implement yet another talent management function. And the thinking seems plausible at first - as employees are an enterprise's key asset and managing their talent can move the needle in the right direction significantly. 

    But I don't want to bore you again with attrition, flight risk etc. Enough written and said about that. The problem with talent management implementations is that they require the enterprise (like any other enterprise wide automation roll out) to adopt a certain way of doing things around talent management. And that's what professionals often do not like, so significant change management is involved in a successful talent management roll out. The enterprises who get this right do well - the rest - well, never mind, another HR technology project with a questionable return on investment.

    The other aspect is the connect factor. It's impossible to connect if you do not have any chance to interact. And the problem with talent management on the connect side is that talent management functions are sporadic and not frequent enough to create the connection. 

    Recruitment happens seldom for the single manager, and though sometimes in larger amounts, not on a regular basis. Compensation usually gets managed yearly, and with the current economic downturn, unfortunately even less frequently in many enterprises. Performance management should be an ongoing discipline, but if we are honest, 99% of enterprises keep this to the performance review intervals. And e-learning is great, but again sporadic and mostly compliance triggered - so no frequency, and the business questions the value. Last but not least succession management - which is so strategic for most companies that they... totally neglect it. To be fair - while practicing good performance management is hard, say like running a 25 minute 5k, practicing succession management is more like a 4 hour marathon achievement. For the whole enterprise. So a lot of discipline, no couch potato to 5k tricks - but a lot of commitment - and professionals are too often too busy to even start practicing.


    The paycheck has frequency

    If you think about it - the most frequent thing that reaches the enterprise coming from the HR department is... the paycheck. Ironically business automation has made it disappear, in my opinion thus aiding the disconnect and value perception deficit of HR. Most employees today only look at their paycheck when something is wrong or they need it for a credit event in their private lives. The culprit is the existing payroll system - which is so efficient it doesn't even bother to present its output to the receiving end, the payee. How many enterprises today push employees to look at their paychecks once they have been generated - or even notify them? From my experience very few.

    The main reason is - there is little value in the paycheck today. It just does what it's supposed to do, and since the gold standard of payroll is to just run, the paycheck needs to be accurate in its boringness. Only if something is missing or wrong does it become interesting...

    So if HR departments want to connect again - they need to think of ways to make the paycheck more... interesting.


    Interesting paychecks - from payroll perspective

    In the previous post on the subject I provided some suggestions on how the paycheck and payroll should be re-thought - just with the means of a payroll perspective. To spare you the click - here is the list of suggestions again:
      • A Payroll 2.0 product should do away with the traditional pay-run. While a sacrosanct ceremony for most payroll managers, there is no reason to keep this practice. Why not let business managers start, run and simulate a payroll? Or push it further and let the employee initiate it and see what his next paycheck will look like.
      • A next generation payroll system should also allow micro payments and payouts. Why not allow an employee to be paid weekly vs bi-weekly vs monthly - or even more employee oriented -on demand? It will certainly make the compliance side more complex - but the architecture of a next generation payroll system should not be the limitation.
      • And while a lot of noise has been made around Total Compensation Management, it has only happened on a very high level for employee benefits - both monetary and non monetary. We are far away from an employee being able to e.g. determine when his take home pay will reach a certain amount. 
      • Equally, next generation payroll systems should support managers in the process of scheduling workers. It will certainly help a shift manager to call in employees for extra weekend work if he can tell them how that extra work will affect their take home pay at the end of the month. Likewise payroll data is seldom used in shift planning and workforce planning applications - it usually stops with basic pay and overtime pay.
      • And when moving payroll to the cloud, the whole electronic banking process should be enabled. The employees should be able to determine bank transfers, split paychecks if needed (think of legal reasons like alimony) and pool paychecks from multiple employers. Or just be able to send or produce the latest payslip for a credit event.
      • Finally we should see 21st century compliance integration, why move data to paper if you can communicate with a government cloud, e-file returns etc. Features like this will reduce compliance costs and with that make the new products more attractive to enterprises.
      As you can see - plenty of ideas to innovate around the paycheck from a pure payroll perspective - but what about beyond....
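      The flexible pay frequency and simulation ideas above can be sketched as a date-range based calculation instead of a fixed pay-run. This is a hypothetical sketch with an invented function (net_pay) and a flat tax rate - real payroll compliance is vastly more complex.

```python
# Hypothetical on-demand pay calculation - invented rates, flat tax for illustration.
from datetime import date

def net_pay(daily_gross, start, end, tax_rate=0.25):
    """Net pay for the days worked between start and end (inclusive)."""
    days = (end - start).days + 1
    gross = daily_gross * days
    return round(gross * (1 - tax_rate), 2)

# The same engine serves weekly, bi-weekly, or employee-initiated payouts:
weekly    = net_pay(200.0, date(2013, 6, 3), date(2013, 6, 9))  # full week
on_demand = net_pay(200.0, date(2013, 6, 3), date(2013, 6, 5))  # 3 days so far
print(weekly, on_demand)  # → 1050.0 450.0
```

      The point of the sketch: once pay is computed over an arbitrary date range rather than a scheduled run, micro payouts and "what will my next paycheck look like" simulations fall out of the same calculation.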


      Interesting paychecks - from a talent management perspective

      ... what could the paycheck do - when presented on a regular and consistent basis - could it even help to create value beyond the payroll function, all the way to talent management? I think so - let's take a look:
      • Recruitment
        Why not present open headcount and requisitions on the paycheck - for the department or division of the employee? Yes - we know, employees could check the internal job boards - but seriously - how many employees do that on a regular basis?
        The paycheck is a perfect vehicle to tap into the employee's network for open positions, as the glance from the payroll information to open jobs takes... less than a second. And you are thinking money, so for any rewards for referrals - the mind does not have to wander far. Needless to say, an employee may also, maybe motivated by the current salary, look for a better internal job (vs an external one). Any analytically inclined brain will now go into overdrive...
         
      • eLearning
        We all know that people are driven by rewards, so why not congratulate them again for their successful conclusion of a training course on a paycheck?  It won't hurt. Likewise - use the paycheck to remind them of upcoming, related, relevant training opportunities.
        Lastly - why not add some gamification to the whole process and reward employees for keeping certifications and compliance up... or for being in the top 20% to pass or the top 10% to take the course - here is your infamous $10 Starbucks card code.
         
      • Performance Management
        Let's be conservative - let's only announce how many weeks are left till the next performance review. Just a friendly heads up. Or go beyond and remind a manager how many employees have done their self evaluations already, how they rated the manager etc. And again - gamification options aplenty - shown directly in the employee's paycheck.
         
      • Compensation Management
        This one is more tricky, but closer to home for payroll. As mentioned in the section above - why not offer a calculation model for the savings toward the employee's next big purchase? Why not show how the healthcare plan fares value wise for the employee? Or what the other benefits are worth to the employee?
        And the paycheck is a treasure chest when it comes to pay for performance - if you want to see how you are doing in regard to achieving your bonus, a paycheck with the usual (for the US) bi-monthly cycle is a good cadence to remind employees how they are doing and equally to get their attention on the subject of performance driven pay.
         
      • Succession Management
        Well, this one is the hardest. Not sure if you need to be reminded of not having done your succession planning. But you could reverse it - and if part of any performance plan, the paycheck could project what the manager will miss if he doesn't handle this delicate subject.
        Equally it will be good for executives to see how recent promotions and exits have affected the succession chart. There will always be work - and again the fortnightly (in the US) nature of a paycheck is a good cadence for an executive to see how his team is doing in succession management - not just a level down, but throughout the executive's whole management responsibility. 


      Architecture matters - always

      If you want to achieve some of the interactive paycheck scenarios above - you need a different payroll engine than the ones that powered enterprises in the 20th century. Take for instance the scenario where an employee sees that he can take an eLearning course now - and get rewarded for being in the first 10% of employees completing it. The employee's expectation would be that the paycheck is immediately re-calculated once he has taken the course. 

      And why not allow for that? This raises of course some compliance and statutory concerns, but also some architecture implications - a paycheck needs to be available all the time. It leaves the shackles of a report-only past and becomes an interactive tool to show employees the money in their pocket - and how good HCM practices help the very individual bottom line. 
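      The instant-recalculation scenario can be sketched as an event-driven calculation: the paycheck becomes a function of base pay plus any rewarded HCM events, recomputed whenever an event arrives. The function name, event shape and reward amounts below are invented for illustration.

```python
# Hypothetical event-driven paycheck recalculation - amounts invented, flat tax.
def paycheck(base_gross, events, tax_rate=0.25):
    # The paycheck is recomputed from base pay plus all rewarded HCM events.
    rewards = sum(e["reward"] for e in events)
    gross = base_gross + rewards
    return {"gross": gross, "net": round(gross * (1 - tax_rate), 2)}

events = []
before = paycheck(3000.0, events)  # what the employee sees today

# Employee finishes the course among the first 10% - the new event triggers
# an immediate recalculation of the visible paycheck:
events.append({"type": "course_completed_top10pct", "reward": 150.0})
after = paycheck(3000.0, events)

print(before["net"], after["net"])  # → 2250.0 2362.5
```

      The design point: instead of a batch run producing a static report, the engine treats every HCM action as an input to an always-recomputable paycheck.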

      Communication matters

      The attentive and critical reader may now be (rightfully) saying - this is the construction of a portal, this has been done before. And I would agree, it has been done before - but why has there been no success in getting employees to use these portals? At the end of the day it comes back to value and frequency, as mentioned above - and here is where the paycheck comes in. In the worst case it has a monthly frequency, in the best case a weekly one - the best cadence to deal with talent management issues lies somewhere in between. And given that the paycheck transports information of high interest to the employee - we all want to get paid - there is a natural interest to visit the information, and with that to spring into action. Even more so when the action will affect the bottom line of the paycheck. 

      Smart implementations of a paycheck 2.0 will of course step beyond the pure presentation of information - and allow the paycheck recipient to act on any talent management (and paycheck) need right from its electronic presentation. And instantly see how the talent management actions affect the net on the paycheck. 


      MyPOV

      A smart implementation of the next generation paycheck should be a good strategy to take an enterprise to the next level of HCM practices, thus reconnecting the HR department with the line of business and notching the value creation conversation in a favorable direction for HR. 

      Before that can happen, it requires vendors to re-invent payroll and create a 21st century payroll system that no longer is an output generator - but an interactive engine capable of integrating all relevant HCM data and actions, down to the single paycheck, the single employee. 

      So next time you look at your (hopefully correctly calculated) paycheck, close your eyes for a moment and dream of what it could do... 



      After spending more than 25 years building enterprise applications, and realizing that I will probably have another 25 years to work, I took some time to reflect on how I want to spend the rest of my working years...

      Short Version

      I am thrilled to change perspective from the vendor to the analyst side and start to analyze, comment and advise on the key trends of enterprise application software, with the fundamental changes happening through the cloud - and to follow the basics of these changes by covering IaaS and PaaS for Constellation Research. At this point I think the foundation for the next generation enterprise applications needs to get richer - so don't be surprised to see some forays into analytics, BigData, mobile and social - and actually their incarnation as SaaS. 

      With Constellation Research I am very fortunate to join a great and dynamic analyst firm, with great individuals on board, where I have colleagues I can already consider friends. Can't get much better. I am committed to continue blogging here, so no worries. But also check out my Constellation Research site here.

      Longer Version

      Reflection needs some time off, so I was lucky that my last position gave me good time off, through this beautiful construct British HR calls the garden leave. I used my time in the garden to get some things off the bucket list. Seeing the US Olympic track and field trials live in Eugene was something I had always wanted to do since watching them on TV as a little boy in Germany and Italy. Running a (very slow) marathon came off the list, too - and it was so much fun I ran another one (faster) and 4 half marathons during that time as well. I would never have thought 3 years ago that I could / would be a runner and enjoy running to the point of a healthy addiction. Also took a great Caribbean sail trip from St. Lucia to Grenada. Spent a lot of time with the kids, helped the USYVL as clinician and IT advisor - and most notably, avoided travel. Done enough of that.





      Looking back, the enterprise software industry has been very good to me - right from the start, where my first internship got me to test and document SFA software. Lucky me, Kiefer & Veittinger would become the largest European CRM vendor, and having been something like employee #2, I was very lucky to be taken on the ride and to help make it a great one. And I will forever be thankful to Georg Kiefer and Klaus Veittinger for a tremendous level of trust and empowerment, that I have never seen before or since. I learnt a lot, was able to do a lot and most importantly had a lot of fun helping to build a pretty unique company. The approach of generating uniquely configured enterprise software based on standard objects, inheriting an information model into this construct and only allowing break points (today APIs) to customize the system - has been unparalleled so far.


      And when the time came, first through a partnership and then through stepwise investment, Kiefer & Veittinger became part of SAP as their first acqui-hired (the term didn't exist then) company, and I was blessed with luck again. Not only was I charged to lead the precursor of SAP's CRM (FoCus) - but Hasso Plattner also insisted that I be part of the product development team - and the team developed the first installments of SAP CRM with Marketing Analytics, on top of SAP BIW (as it was called then). Laying out the 5 year roadmap for SAP CRM was a great learning opportunity and to this day gives me a sense of ownership with regard to SAP CRM. And I was equally lucky to later work in the Office of the Chairman, directly for Hasso Plattner and Henning Kagermann on special projects. An amazing leadership duo. The Vorstandsassistent job is a fantastic opportunity to get to know a company inside out.

      I was lucky to work on very interesting projects, combining some of the business consulting skills we had used as a differentiator at Kiefer & Veittinger with SAP internal projects. I can't mention too much, but verticalizing the sales force and making marketing more consumer-company style - more about perception than technology - were two of the projects that became public. But I wanted to build software again, and along came Oracle, which wanted me to help build CRM applications.




      Working for Oracle during the dot-com boom was a thrill. I was familiar with Silicon Valley from long stints with SAP's lab in Foster City and then in Palo Alto - but the pace was even more frenetic. And getting the first version of the Oracle e-Business Suite out of the door in 2000 was a major accomplishment. I am still very proud of the team that, through hard work and very long hours, managed to ship the OLAP and BI CRM applications before the respective OLTP apps - and soon after, the first PRM product of an enterprise vendor. I learnt a lot from my then managers Mark Barrenechea and later John Wookey. Weekly meetings with Larry Ellison were a unique, challenging and exhilarating experience. I will be forever thankful for all the help, guidance and support, directly and behind the scenes, from Judy Sim, Charles Phillips, Sergio Giacoletto and Sonny Singh. Getting a global sales force to do what is not intuitive is a huge and fun challenge. But I wanted to build software again, and along came Fair Isaac, now FICO - looking for a head of products.




      Working for FICO (then Fair Isaac) was an amazing experience. I joined just as the company was overdue to create its new platform and products for enterprise decision management, an analytical area that is still very near and dear to my heart. I am very proud of the team that managed to keep the lights on in many critical customer situations and select a new platform to build the next suite of products on. But the timelines for the new platform weren't realistic to pursue, which I was pretty clear about, and that ended my short time with FICO. I am very thankful to my then boss Bernhard Nann for a lot of guidance, learning and support. And likewise to a great team, where I learnt the most from Carlos Serrano-Morales and had the best operations director ever in Stachia Clancy.



      So I had some time on my hands, was stuck in beautiful San Diego and took my first sabbatical. I was tired of creating, fixing and turning around enterprise software - so when SAP was looking for a Chief Application Architect in Palo Alto, a position where you had to manage no one but could work on interesting architecture projects, that looked like the perfect job to me. And a great job it was - I learnt more about technology in that time at SAP Labs than ever before in my career, since for the first time researching, learning and designing things was my main job. Before, anything like this was always time-constrained and to a certain point a luxury. I remember very insightful conversations with Ike Nassi, Rainer Brendle, Karl-Heinz Roggenkemper, Kay van de Loo and Larry Cable. And I am very thankful to Frank Samuel for the best crash course to close my 9-year SAP know-how gap. I had the privilege to work on very exciting projects - but saw too many good ideas and concepts not finding uptake in Walldorf. Having worked on both sides of the Atlantic, that was particularly troublesome to me - and I missed building products that would make a difference in a few quarters and not - maybe - in years.


      So along came NorthgateArinso, which was very interesting to me as I could learn beyond CRM - not only a new area of enterprise automation with HCM, but also that life as a BPO provider is different from life at an enterprise software vendor. When your weekend update has an issue Monday morning as a BPO provider, you end up losing money by 8 AM, paying penalties for missed service levels... And a lot of credit goes to the former Arinso team that figured out how to make R/3 multi-tenant and enable a powerful and cost-effective BPO platform. I was lucky to have a very strong team of executives working for me, with Muhi Majzoub taking care of the UK and leading it to the start of a new ResourceLink product, Eric Delafortrie showing me the ropes around HCM and making the euHReka product a success, two gifted managers in Christine Morris-Jones and Sam Xydias, who ensured a zero-escalation period from down under, and Tony Whitehead, who successfully managed a business unit on the very challenged ProIV product. It was also my first experience working with private equity, and I am thankful to Dhruv Parekh as a great confidante in these matters. But NorthgateArinso wanted to de-emphasize products for the benefit of services, a risk which was there all along - so I found myself in the garden.

      I know I mentioned some former colleagues here - and I know I have missed tons of great, challenging and inspiring colleagues. Management by wandering around is something I love to do and I think is essential for a product team's success. Software is built by people, and you need to know them, give them time and attention, listen and learn first and foremost... so to the many unnamed former colleagues I have had in Mannheim, Bangalore, Reading, Boston, Foster City, San Mateo, Palo Alto, Redwood Shores, Herndon, Los Angeles, Geneva, Munich, Milan, Paris, Stockholm, Dubai, Singapore, San Diego, Atlanta, San Ramon, San Jose, Irvine, Brussels, Bristol, Peterborough, Hemel Hempstead, Manila, Adelaide, Sydney (and sorry to the locations I forgot) - I have not forgotten you and the many things I learnt from you.

      Through my garden leave and sabbatical I realized that maybe I don't want to go back to fixing, creating and making enterprise software successful again. There were good and interesting opportunities, but somehow I could not muster the excitement those opportunities deserved. I knew Ray Wang from our time at Oracle, and he has been coaxing me towards analyst work for a long time. I always enjoyed conversations, meetings and briefings with analysts.

      All my contacts in the analyst world told me that the toughest thing for an analyst is all the writing. And so I put myself through a post-a-day routine - and you see the results of that on this blog. And while I am far from the writer I would like to be, I started to enjoy it - even having withdrawal symptoms if I didn't post something for a day or two... And I am thankful to all the readers of this blog for all the great feedback I have received, the encouragement and the great criticism.

      From the little I know about it, the analyst world is changing: away from the large firms, while the medium-size firms have disappeared or been acquired - and I wanted to be part of something that was changing the way analyst services are presented and consumed. So Constellation Research looked like a good fit, and I am thrilled to have started there this week, covering the basics of the enterprise software transformation that we are witnessing, with IaaS and PaaS, with forays into what makes these successful: SaaS, analytics, social, mobile and BigData.

      So please follow me along the way. I look forward to hearing from you, and I promise I will give my best and try to make and keep it exciting and fun - as it has been so far!

      0 0
    • 06/05/13--10:48: Why IBM acquired SoftLayer
    • Why IBM bought SoftLayer

      On the morning of June 4th 2013 we learned that IBM announced its intent to acquire SoftLayer - a strategic acquisition for IBM in its quest to reach US$7B of cloud revenue in 2015. Financial terms were not disclosed - but the street says the deal is valued around US$2B.



      IBM is serious about cloud

      IBM has been much more active with respect to cloud since the spring of 2011 with the debut of “IBM SmartCloud”, and activity has further increased in the last 6-12 months. With the endorsement of OpenStack as a standard, IBM opted for the standard route of the new IBM, which is to use standards. But the pool of OpenStack vendors has become pretty crowded in recent months - basically raising the question of how to differentiate among all those vendors endorsing OpenStack.

      At the same time IBM backed up its cloud ambition with the commitment to the US$7B goal - with the target year of 2015 coming sooner in sales cycles than one would think. A traditional route through hardware differentiation was not in the cards and also not fast enough - so IBM more recently took the route of an acquisition.



      Why SoftLayer makes sense for IBM

      Amongst the potential acquisition targets IBM could choose from, SoftLayer makes a very good fit. Given IBM’s legacy on the hardware side, private cloud and cloud business solutions (analytics, Smarter Cities, etc.) and conservative clientele, it needed a cloud player that would be strong on both the private and the public cloud. That SoftLayer was building its own machines may sound counterintuitive at first - but if you take into account the recent rumors that IBM may sell its commodity server business, it starts making sense again. And while IBM has very talented hardware architects, the 8 years of experience SoftLayer has gathered building cloud hardware and vital support technology (IMS) is an asset that Constellation Research is glad IBM will leverage - one way or the other.


      Moreover, IBM needed the capability to let customers operate hybrid clouds - be it to wait out the write-down of existing on-premise hardware, be it for security and compliance reasons when some parts of their enterprise applications need to remain on premise. SoftLayer is an excellent choice for this capability.


      Finally, SoftLayer has invested in a very robust, high-performing data center and network infrastructure - a case in point being SoftLayer signing up 60 gaming companies in the last two quarters. Gaming is a very demanding space in which a vendor can only be successful by understanding the infrastructure requirements and addressing them well, and SoftLayer now has the track record to prove it.
                    

      IBM and OpenStack – business as usual

      SoftLayer has been very active in the OpenStack community, especially around Swift. SoftLayer does not change IBM's strategy with OpenStack at all. On the contrary, we would expect IBM to make more contributions to OpenStack in the near future. On the flip side, IBM needs to differentiate in the OpenStack field - so it will be interesting to see how IBM will play that in the next quarters.


      SoftLayer and IBM SmartCloud Solutions

      IBM’s SmartCloud solutions now have an even more viable platform to be deployed on - across the different flavors of the cloud: on premise, in the cloud and hybrid. This combination has the potential to bring IBM back to the place it held over half a century ago - the essential standard choice as a hardware and services vendor for whole categories of customers.


      For IBM Cloud customers

      This is an exciting announcement, and while it may be a distraction in the short term, it solidifies and validates IBM’s commitment to the cloud. The flexible deployment options that SoftLayer has proven in the marketplace are a big benefit to customers, so wait and see how this develops in the next few months.


      For SoftLayer customers

      Carefully evaluate if you are in a customer segment where IBM wants to be in business. E.g. as a gaming company, make sure you get the commitments from IBM to keep maintaining and investing in SoftLayer. IBM has stated that it plans to invest significantly in SoftLayer’s technology and grow that business and model.

      For Competitors

      This acquisition notches up IBM's capability and makes IBM an even more serious contender. This will definitely be felt by all the hardware vendors dabbling in cloud - e.g. HP and Dell. But it may also give some food for thought to the cloud purists at AWS and Google, as they do not offer anything private or hybrid. It definitely gives IBM a leg up against AWS in the battle for where the enterprise puts its cloud applications.

      For IBM

      It will be key to communicate the next steps. Kudos for sketching out a high-level roadmap in the press meeting - this early in an acquisition. IBM will need to do more in the next months to clarify the offering and explain why this acquisition makes IBM more compelling as a cloud vendor.


      The formation of the new cloud services division makes a lot of sense, concentrating experts and unique skills for this market segment under one common leadership.


      MyPOV

      A very good move by IBM. Other vendors recently claimed to be fully on the cloud - well, IBM made clear they are in the game for real and mean to make this a multi-billion dollar business. It will be interesting to see in the next months how IBM integrates SoftLayer and tunes its go-to-market for the new IBM cloud products and offerings.


      We will know the cloud has completely arrived at IBM when the company announces a cloud division that encompasses not only the services but everything needed for a successful cloud solution - including the hardware and software. Constellation Research does not expect this to be too far out.






      0 0

      This is a busy week of acquisitions - with IBM's SoftLayer (my take here) and salesforce's ExactTarget already done, SAP didn't stay on the sidelines, announcing the hybris software acquisition today. Not all transaction values were disclosed this week - but my estimate is that north of US$6B changed hands this week in the enterprise software market.

      Quite a week - and it's only Wednesday.




      hybris Software

      The company is a veteran of the e-commerce age, having been founded back in 1997, which on e-commerce timelines is like centuries ago. But hybris has been able to grow by adding functionality way beyond the original e-commerce scope and expanding into the enterprise.

      Over time hybris has grown way beyond the original scope of e-commerce - out of the necessity to support the customer base. Customers were working on more channels - well, that requires some MDM capability, which also lends itself to integrating with various ERP back ends. Customers wanted mobile commerce - well, that required hybris to support mobile shopping carts. Multiple channels and systems - here comes order orchestration... and so on.

      hybris has shown endurance and staying power on the topic, no doubt. But its recent success of close to 90% YoY growth was also aided by reduced competitive pressure following IBM's Sterling Commerce acquisition and, even closer to home, Oracle's acquisition of ATG. That left hybris as the only dedicated e-commerce vendor standing from the 2011 Gartner e-commerce Leaders quadrant. A void into which hybris executed very well, so congrats to their management team on that.


      SAP Whitespace

      As stated before, SAP - like its competitors - is under enormous revenue pressure to produce the results expected by the markets. With a significant over-licensing situation, it's getting tougher to find new areas of business automation to sell to customers. And one area that has been under-penetrated by SAP since the dot-com boom and the markets' pseudo-boom phase that followed is e-commerce.

      One ironic example of this has been the poster-child customer on the call, Grainger, which sells maintenance and industrial supplies in the US. As early as 1999, Grainger was the customer SAP would demo, as they had a great vision of selling micro-targeted and micro-priced products and were a showcase for SAP on the back end (back then under the MRO hype mantle). And while they still use the SAP back end, their e-commerce automation had moved to... hybris.

      So SAP has no good, complete electronic interaction and commerce platform for customers, especially consumers. And competitors like Oracle and salesforce are providing the front ends to the customer interactions where SAP provides the back ends. This should make scary late-90s scenarios pop up in Walldorf, when SAP was in the same position - only back then it was Siebel that was getting big in the front office.

      Of all recent SAP acquisitions, hybris has the most white space from a SAP install base perspective. More SAP customers had Business Objects, Ariba and SuccessFactors than have hybris. The 80 or so common customers are a drop in the ocean for the 240k+ SAP customer pool. So most likely this is the largest potential SAP has ever had to address post an acquisition.


      Where will hybris go?

      As is standard by now, hybris will remain a separate entity - which on paper is always good to preserve culture and dynamics, but eventually unravels at SAP. Unusually for a 2B+ (speculated) acquisition, the hybris CEO does not become part of the board (as did John Schwarz, Lars Dalgaard and Bob Calderoni). Or maybe the co-CEOs thought it wasn't time for yet another top-level re-org (my take on the last re-org here).

      This may even accelerate the integration of hybris further, as SAP will have to push looking for cross-sell opportunities into its install base. The big question is - what appetite does the average SAP account manager have to sell e-commerce solutions, something the veterans got burned on and more junior members of the salesforce may not see the sweet spot in? But then HANA-based products are not directly around the corner either - so I would expect some account managers to do crash courses on how to sell e-commerce.


      Integration matters

      SAP knows by now that when it acquires another vendor, SAP customers expect an integrated solution. This has been a steep learning curve recently. Business Objects, as a BI vendor, brought its own integration tools; same for Ariba. That alleviated the need for newly created integration options.

      But with the SuccessFactors acquisition it became clear that the thin integration SuccessFactors offered was acceptable to customers when they bought from a separate company, but would not work once SAP and SuccessFactors were a combined entity.

      The challenge for SAP is that it does not have a viable integration platform. HANA Cloud Platform is too young and faces many other challenges. NetWeaver PI is coming around these quarters - but the main question is where to integrate to: the classic Business Suite or any brand-new HANA-based products.

      On the press call there was even a hint that hybris may be re-platformed on HANA. Putting hybris on HANA may be straightforward for hybris - but it will create even more integration needs for SAP, as the data to run a multi-channel e-commerce system like hybris will have to come out of the Business Suite. Which technically could run on HANA - but even in that case data needs to be accessible for hybris, and not only will data flow to hybris, it will also need to be written back.

      And I doubt the SAP salesforce will want to limit the cross-sell potential only to customers that have based their Business Suite on HANA. Which creates additional integration work with regard to getting the relevant ERP content exchanged between hybris and the Business Suite in classic deployments.

      Last but not least, hybris announced its integration to SAP only a short 3 weeks ago at Sapphire - and while it runs at Grainger, Levi's, Phonak etc., it seems to be a pretty new offering.

      The good news for SAP is that the integration needs are so vast that they may raise the bar for competitors to sell into the SAP install base - but only once SAP integrates well enough.


      The (biggest) missing pieces

      SAP still has a few missing pieces for the overall B2B2C strategy - and that's the orchestration of customer relationships and business across channels. This job falls to market segmentation and relationship automation via campaign management. Both functionalities exist in the Marketing module of SAP CRM - but they are not up to 21st century best practices.

      Equally, SAP lacks functionality for another key component in the consumer space: sentiment analysis. The former Business Objects Inxight handles text well - but cannot process the signals a consumer leaves as an electronic interaction trail. And the partnership with NetBase - well, it is a partnership. Not enough to counter and compete with salesforce's Radian6 and Oracle's Vitrue and Collective Intellect capabilities.

      So expect SAP to take out the checkbook soon again.

      Implication for SAP customers

      SAP customers that are thinking about their e-commerce presence and automation should pause their efforts and see what SAP will come up with in a reasonable timeframe. hybris' capabilities are advanced enough to justify a reasonable timeout.

      Implication for hybris only customers

      You need to get assurance from SAP that you will be supported down the road. I just spoke to an SAP customer today who simply got forgotten because of being on a platform with no go-forward plan. Extract the concessions in the next weeks, while there is still attention from SAP top management on these matters.
       

      Implication for SAP partners

      This is good news for partners, since most likely there will be a lot of time and labor to be spent - at least for a transition period - which will help SI revenues. If Bill McDermott is right that CEOs want this, then expect lucrative budgets.

      Increase your staff expertise on e-commerce in general and hybris in particular. Polish HANA skills and assess the likelihood of integration going towards HANA. Likewise for NetWeaver PI - it's new, so get trained and get experience under your belt.
       

      Implication for SAP

      This is a very good acquisition for SAP, securing revenue potential for years to come and creating only insignificant product overlap questions, unlike the SuccessFactors buy. How many customers are still on the venerable SAP Store these days? I guess they will be happy to move to the hybris offering. But SAP will need to do a better job communicating and executing the integration plans than it did recently with SuccessFactors.
       

      Implication for competitors

      Depending on how well SAP addresses integration, there is a remaining window of opportunity of varying length. Most likely hybris will slow down in creating new leading functionality - even though no one from SAP or hybris would admit to this - so this creates an opportunity to claim functional leadership with hard-to-sell-away differentiation points.
       

      MyPOV

      This acquisition makes much more sense for SAP than e.g. the SuccessFactors acquisition. Though both will alleviate / have alleviated competitive pressures, the hybris acquisition sees pretty much no functional overlap with existing SAP offerings - so much less explaining to do externally and internally. And moving hybris to e.g. HANA is a simpler scope than moving SuccessFactors. I bet if SAP wanted, hybris could run on HANA before SuccessFactors.
      But SAP needs to address the integration roadmap quickly and execute on it.

      Once more like all things product at SAP these days - it's all about execution.


      0 0

      The other day I went through a rather mundane challenge... I was running Office365 on one machine - but didn't like that machine, returned it and got another, new one. Well, all I needed to do was re-install Office365, which I need in order to edit and see revisions of research reports.



      Well - the process started by noon on Thursday, and I was finally done on Friday close to 10 AM - nearly 24 hours later. I don't want to dig into the details - but at some point I started tweeting about my progress and created a Storify; if you want to revisit some of the highlights you can find them here.




      But what we can learn from this is that the reality of customer support, even at a deep-pocketed company like Microsoft, is... pretty sad. Maybe I had a bad day, was just unlucky, had agents at the end of their shift etc. - but I don't believe that, given the nature of the observations I was able to make.


      Solid Products - less support

      All issues seemed to be around downloading and installing the Office365 install files. I realize they are huge, but not larger than other downloads I do (development environments, legally purchased movies etc.). This whole problem could have been avoided with a more solid, fault-free download.

      And typically, like many high-tech companies, Microsoft tries not to eliminate the problem at the root but implements workarounds... I was surprised Office comes with repair tools - a quick repair tool and an internet repair tool (which takes longer; not sure why the internet name would make it look like it takes longer, but ok). And indeed these tools work - on the original machine one of the repairs helped me to install Office365. But how about instead making sure that the install files don't have issues in the first place? That would certainly be the leaner approach.

      Worse - Microsoft bends its own rules. At least the ones from the past, when a Windows program needed to uninstall cleanly. I have seen plenty of bad and unclean uninstalls - but never in my 20 or so years of using Windows (ok, dated myself - Windows 2.1 was the first Windows I ever used) had a program that would not uninstall itself... but now I had Office from Microsoft themselves - and knew I was in trouble:



      And finally - I cannot prove it, but there may be an issue with authentication at Microsoft. Somewhere between the Microsoft ID and the Office email address there was a snafu. Can't pinpoint what it was; it surely wasn't solved by deleting browser properties - we tried this three times - but something in the back-end did not work (as I could not log in).



      Confused about Microsoft ID - don't worry Microsoft is, too

      In all three conversations I had with Microsoft 1st level support, they had me try to log in with my Microsoft ID and alternatively with my new Office365 email ID. Well, it turns out - as confirmed by the last support agent, who finally fixed my problem - it has always been and is only the Office365 ID you should use to log into Office365. Seems intuitive - but 3 support agents had me try another path. How and why it finally worked - I was not privy to the root cause.


      Support best practices are... hard

      Interestingly enough, every one of the 10 agents I talked to asked for a callback number in case we would get disconnected. Very good practice - only you need to actually call back when the line drops. In my case it dropped 3 times. So why ask for the number when you do not call back - or your agents do not have the capability to call back?

      And of course you should have a knowledge base and trained support agents. I mentioned the confusion on the Microsoft vs. Office365 ID. But we also went down dead ends like switching browsers, deleting cookies etc. At no point did I have the impression the agents were working from a cohesive script.

      Moreover, you want to make sure your agents know where to transfer calls. All first level support agents knew I had a problem with Office365 - but I got routed twice to the wrong 2nd level support. No idea why. Frustrating for the customer, and this costs Microsoft real money.

      It was good to see that four of my transfers were personal, warm transfers - but I had to start from scratch every time. It does not look like the agents are working on one common system, as I had to re-answer the same and similar questions again and again.

      But on the downside, there is always some signal strength loss with every transfer in the VoIP ether - so twice the 3rd level agent could not hear me anymore... obviously the signal boosting does not work well enough.

      Lastly - global remote support is not an easy task. But you should be aware of the time zone your client is in. One 2nd level support agent told me that the next level of support was gone because it was after 10 PM PST - but it was 9:17 PM PST.

      Needless to say every agent I talked to was friendly, patient, polished and professional.


      Make your support agents life easier

      Only the 9th agent logged a case and I got a service request number. But all other agents beyond 1st level always asked me for a service request number... so make it a practice to log a case and generate a number right away.

      And anyone who came up with this order number format nncnccnn-nnnc-nnnn-nccn-nncnccnnncnn needs to call India and pass 10 order numbers across the line every day... it took me 2-3 minutes to get that monster across the phone line... so make it easier for the support situation.


      Social practices are ... harder

      I wasn't private about the problem anymore - I used the Twitter hashtag #OfficeSaga and tweeted every step along the way. Some fellow Twitter users retweeted. I addressed tweets to @Microsoft and @Office. But the social pickup came when it was all done... and, typical of social networks, a Twitter user replied - even before me (inside joke: guess at #HANA speed):



      Since then - nothing, nada, zilch... so when you engage in social relationship management, you need to monitor earlier, be faster and round out the interaction.


      Biggest Concern

      The agent that managed to help me at the end of the day is working in a team that is only available 9-5 PST. That is hardly enough coverage for software support in a country like the US. Yes, I had the option to continue to emergency and production issue support - but that did not seem adequate for an Office install issue; I have been on the other end too many times to do this... but Office buyers and users deserve better time coverage in my view. Unless Microsoft shows me that I am the only one calling in after hours... but empirically I doubt that, since I got the customary "due to unexpected call volumes..." message right at 8 AM. So plenty of backlog.


      Advice

      If you are in charge of support somewhere and reading this - I hope your team does better. If not, and you are aware of the weak points, then address them asap. If you think you are stellar - then pick up the phone and try a few warm transfers and see when the nth agent can't hear you anymore.
      ...


      MyPOV

      Maybe it was just bad luck. But it showed some systemic issues in customer support that I would have thought a well-funded, high-tech company like Microsoft would solve better. It concerns me about the general support experience out there. Good luck next time you have to call 1-800...

      0 0

      Cornerstone had their yearly user conference, Convergence, in San Diego from June 6th-8th. The conference was well attended, with over 1000 customers, partners and employees coming from all over the world.


      It’s all about - Re-imagine …

      The new marketing tagline of Cornerstone is all about re-imagining the product offering. The re-imagination centers on two directions – product functionality and user experience. In two keynotes, CEO Adam Miller went to great length and into detail to highlight the latest improvements in both directions.




      Picture from Twitter
      Miller did a great job explaining how the Millennials' increasing share of the labor force changes the requests and demands on the enterprise and, with that, on its HCM and talent management systems. Equally, the consumerization of IT requires vendors to upgrade their offerings with easier-to-use systems and bring-your-own-device (BYOD) support. And work itself is changing, becoming more international, device independent, location agnostic and multicultural.

      Re-imagine functionality

      Cornerstone wants to re-imagine the functionality behind its three key automation areas – performance, learning and recruitment. In each of the three areas, Cornerstone has targeted specific new functionality to bring it further along and closer to the new workforce's needs. Not surprisingly, recruitment received special attention.



      Example Re-imagine Performance - Picture from Twitter

      Re-imagine user experience

      The other direction of the re-imagination project is the user experience. Cornerstone rightfully realized about a year ago that its product’s user experience was falling behind and embarked on a brand new user interface with a much more 21st century feel – you may even say a consumer grade user interface.

      Attendee acceptance of the new user interface was very positive, and I think Cornerstone has done a very good job creating a lightweight, easy-to-use interface experience with good cross-platform consistency.



      From ZDNet - June 4th 2013.
      But it’s not only the user experience; the modern workforce also needs more modern collaboration tools, so Cornerstone rebuilt the Connect product, consistent with the new user interface. At this point it is a powerful, multifunctional feed function that serves collaborative usage needs well. The Connect product is complemented by solid project and task management functionality.

      My main concern with the new user interface paradigm is the scroll intensity and its related downsides. The old user experience maxim that the eyes are faster than the mouse comes to mind – when users need to scroll to get to information, it takes time, and in the worst case they may not even find the information, because they never start scrolling. But overall the new Cornerstone user interface is a significant increase in usability and can rightfully claim to be consumer grade.


      Public Sector, SMB and force.com

      Cornerstone has expanded beyond being a pure horizontal vendor – with a focus on the public sector – and is adding (North American) functionality to make it a strong provider of talent management for this vertical. And while public sector organizations can certainly use help managing talent, it was a surprising choice of vertical to me.


      Equally, Cornerstone has started to offer products for small and medium-sized businesses (SMB) – mainly centered around the performance functionality from last year’s acquisition of New Zealand’s Sonar6. Sonar6 was a leader in the gamification trend for enterprise applications, and it is good to see some of that DNA not only being preserved for the SMB offering (<500 employees is the cutoff), but also that some of Sonar6's acclaimed performance management functionality, like the helicopter view, has made it into the mainstream product.


      And back in February Cornerstone released its Salesforce-based product. Interestingly enough, the company operated a subsidiary in stealth mode for some time, let it build a product on force.com and sell it to over 70 customers, including Salesforce themselves. It’s a gamble not only on the Salesforce ecosystem and partner sales activities, but also on the force.com platform as an alternative to the more Microsoft-centric mainstream products.


      Cornerstone Uniquenesses

      The company has a few unique characteristics that are worth mentioning here. At the preceding analyst day it became clear that the executive team has been working together for a long time, knows each other well and is very consistent in facing questions from the analyst community. It’s remarkable that these executives were able to grow along with the company through the last years of very rapid growth – living proof of hiring talent for growth.


      Moreover, Cornerstone is one of the very few enterprise vendors out there with a single platform, with all offerings built on top of it. This not only ensures consistency for end users, but also higher productivity for development resources. At the same time, a single platform company needs to keep an attentive eye on the platform staying current, and Cornerstone seems to be watching this well – the user interface modernization being a good indicator. Likewise, the force.com-based product seems to have been an alternative platform validation project, with the nice side effect of yielding a separate product for the Salesforce ecosystem.


      Then there is the location of the company, headquartered in Santa Monica. And while there are more recent successes from Silicon Beach, it remains an unusual location for an enterprise software company. But Cornerstone has reached a good critical size and attracts enough talent for its product development teams to alleviate this concern. It could even be an advantage, as Cornerstone engineers come with a relocation price tag should a competitor try to poach them.


      And finally, a lot of respect has to go to Adam Miller – who is the driving force on product vision and direction. Few software companies today rely on their CEO to do this – and even fewer CEOs come to mind who have that capability. Even more remarkable is that Adam's background is not in the high tech industry – but in legal and investment banking. Talk about a wide talent profile.


      Positive customers

      We always like the opportunity to formally and informally chat with customers and partners at user conferences. And while the overall atmosphere is always giddy and excited at these events, we heard very few negative or critical comments. Cornerstone’s customer base is positive on the company and products and is looking forward to getting the upcoming Spring 2013 release to its users.




      Picture from Twitter

      The road ahead

      Like Workday, one of the few other single platform HCM vendors in the market today, Cornerstone has a year of execution tasks and challenges ahead of it. Once you have a great product you need to get it to customers, and with little localization required for talent management – in comparison to core HR and payroll – worldwide expansion is a key task for Cornerstone. Hiring, training and getting sales people to their first sale in many geographies outside North America will be critical. And then comes the ramp-up of the supporting value chain with partners, consulting capability and delivery infrastructure to ensure customer viability and satisfaction. The latter has the most obvious need for investment – the company does not have a data center in the APAC region yet – something that, due to international network latency, is pretty much a necessity.


      And while Cornerstone is hitting the right notes in terms of re-imagining its core products, it needs to keep investing in them. Across performance, learning and recruitment, the latter is probably the lightest in terms of functionality. And while the weakening of Taleo in the marketplace due to the Oracle acquisition has helped some early customer successes, this situation won’t stay as quiet as it is now for many more quarters.

      Not only will the Oracle / Taleo combo get its act together, but we will also see the Workday recruitment module in the marketplace in less than 12 months from now. This will transform the currently close partnership between Cornerstone and Workday into coopetition at best.


      So it’s time for Cornerstone to invest in distribution, delivery and products. The news of the planned US$ 220M debt offering is a good sign that Cornerstone is serious about the expansion needs and is taking advantage of favorable conditions in the capital markets.


      Advice for partners

      Find a role in the upcoming expansion that Cornerstone needs partners to fill. Find a win / win area that you can invest and execute in and that allows you to setup for above average revenue growth in the next 12-18 months.


      Advice for customers

      You are in good hands with Cornerstone for learning and performance. The extended enterprise functionality is a very good addition and value add.

      • If you look for recruitment, make sure it’s a good fit.
      • If you are in public sector understand the road map and make sure you get re-assurances for its timely delivery if your critical processes hinge on it.
      • If you are an SMB make sure the gamification nature of the product suits your employee base – and if it does – you have one of the most exciting talent management products at your disposal.
      • If you are on the force.com platform and want to run an integrated talent and sales / service system – Cornerstone is one of the more viable and capable vendors.
      • If you are based in APAC make sure you have sufficient re-assurance for onsite application performance.


      MyPOV

      A good event for Cornerstone, with significant product innovation – not only on the business functionality side but also on the critical platform side. Cornerstone has positioned itself well and earned a very good market position, which it now needs to expand and solidify through execution over the next 12-18 months, all across the world. Exciting times for the company’s customers, partners, investors and employees. And when well executed, it will not only re-imagine the products – but also the company.



      Today Oracle announced the availability of Java Enterprise Edition (EE) Version 7. This marks the first Java EE release developed fully under Oracle's control and is an important milestone for assessing Oracle's stewardship of Java and its implications for the enterprise.


      No doomsday here

      There was a lot of concern in the Java community about how Oracle would handle the future of Java, which came under its control with the Sun acquisition. The majority of voices were critical, with the father of Java, James Gosling, being one of the most critical. But criticism has calmed down over the years as Oracle has kept Java open and continued to develop it further - on a similar primus inter pares principle as Sun did.

      With involvement and contributions from co-opetitors like Google, IBM, SAP, Sybase, Tibco and VMware, there certainly is trust in the Java EE community to work with Oracle. And about 9 million Java developers and 18 Java EE compliant application servers attest to the wide adoption and relevance of Java.


      Screenshot from Webcast

      Java remains competitive

      With the Oracle investment, Java remains a competitive development language for enterprise applications. But despite the enormous number of applications and systems running on Java, Java is playing catch up on some of the newer technologies. Take HTML5 support as an example, where we are already in the disillusionment phase of HTML5 adoption, with e.g. Facebook recently abandoning their HTML5 efforts for the sake of native mobile apps. 

      So with HTML5 support now coming in Java EE7, Oracle is playing catch up to a certain point, but the consumer application market seems to be gone anyway - the question is whether this will make Java more attractive for enterprise application development.


      Screenshot from Webcast


      On the core Java side, Oracle added support for JSON processing (JSON-P 1.0), something that could previously be worked around via library inclusion, but with support inside the platform, JSON usage becomes easier and more robust. This also makes the creation of HTML5 applications much easier - applications that connect through WebSocket 1.0 endpoints and need the accompanying life cycle methods that the nature of many HTML5 applications requires. And easier HTML5 markup in JavaServer Faces 2.2 is an important productivity step when building HTML5 apps. 

      I guess that with the combination of the three above - for a user interface intensive enterprise application like e.g. employee self service on HTML5 - you will save up to 10% of development effort and end up with a much more robust application as an end product.

      And the addition of a simplified JMS API, CDI as a core component model and bean validation on POJOs, along with improved default resources for JDBC / JPA and concurrency, will help developer productivity further. It certainly does not vault Java into the productivity range of some of the popular scripting languages, but that comparison is not fair from the starting premise - Java is a fundamental programming language, not a scripting tool.


      Key enterprise additions

      Oracle addressed some key weaknesses of Java for enterprise processes with EE7. There are a lot of long running processes in the enterprise, and Java was never the ideal choice to automate these - as the Java processes would be unreliable and seldom there when needed. With the addition of batch applications this is being addressed, and I am very curious to hear about the success of the first batchlets out there.
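The batchlet contract that the new batch support introduces is simple: do the long running work in process(), return an exit status, and honor stop() for a graceful shutdown. A container-free sketch of that contract (the class name and loop body are illustrative; the real interface lives in the javax.batch API and requires an EE 7 container to run):

```java
// Illustrative stand-in for an EE 7 batchlet (names here are hypothetical):
// the long-running work happens in process(), which returns an exit status
// that the batch runtime records for monitoring and restartability.
public class ArchiveBatchlet {
    private volatile boolean stopped = false;

    // Invoked by the (hypothetical) batch runtime; does the work and
    // reports how the step ended.
    public String process() {
        for (int i = 0; i < 1000 && !stopped; i++) {
            // ... archive one record per iteration ...
        }
        return stopped ? "STOPPED" : "COMPLETED";
    }

    // The runtime calls stop() to request a graceful shutdown mid-run.
    public void stop() {
        stopped = true;
    }
}
```

In a real EE 7 deployment the batch runtime would drive this class from a job descriptor and persist the exit status, which is exactly the reliability piece that plain Java processes were missing.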

      Hand in hand goes the extension of the concurrency utility APIs with asynchronous capabilities - a new category of applications is now attainable with Java, for which it was previously better not to use Java at all. How far this will take Java applications for multi-threaded, concurrent tasks - we will see and hear in the months to come.
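The managed concurrency utilities deliberately mirror the familiar java.util.concurrent API, just container-managed. A plain SE sketch of the asynchronous fan-out pattern this enables (class name and values are illustrative; in EE 7 the container would hand you the executor rather than have you create and shut it down yourself):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncSketch {
    public static int priceQuote() {
        // In EE 7 the container would supply a managed executor; in plain SE
        // we create and manage the thread pool ourselves.
        ExecutorService executor = Executors.newFixedThreadPool(2);
        try {
            // Kick off two independent lookups concurrently (values illustrative)...
            CompletableFuture<Integer> base = CompletableFuture.supplyAsync(() -> 100, executor);
            CompletableFuture<Integer> tax  = CompletableFuture.supplyAsync(() -> 8, executor);
            // ...and combine the results once both complete.
            return base.thenCombine(tax, Integer::sum).join();
        } finally {
            executor.shutdown();
        }
    }
}
```

The point of the managed variants is that the application server owns the threads, so lifecycle and context behave correctly inside a container - something ad hoc thread creation in enterprise code never guaranteed.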


      What was missed

      The Java community had some hopes that Oracle would also address some of the caching challenges, but a few months ago the new JCache was already cut from the release schedule for EE7. Similarly, the high hopes for more PaaS support have been deferred to EE8, as known since fall last year.

      Oracle made the right decision not to wait longer, as enough critical substance is in EE7 and the community should not be kept waiting for it.


      After EE7 is before EE8

      It will be key for Oracle to quickly add caching capabilities with JCache in EE8, or maybe even in a major EE7 dot release - caching has been a tender spot for Java applications for some time. The cloud / PaaS capabilities will be similarly critical, though the delay gives the ecosystem of PaaS vendors some more breathing room to differentiate their offerings and lead further - which helps Java overall. 


      Screenshot from Webcast



      Advice for enterprises - purchasers

      Look for EE7 adoption from vendors and first experiences before postulating it in RFPs. When the results are promising and encouraging, do not hesitate to add it to your requirements. Many benefits of EE7 will make your enterprise applications more attractive to end users and more stable on the backend, so start the conversation with vendors early.


      Advice for enterprises - builders

      If you haven't already - it's time to get your hands dirty with some lab installs and trials. Focus on the highest benefit drivers, probably the batch and concurrency pieces first. And have a look at NetBeans 7.3.1 as well as GlassFish 4.0.


      Advice for ISVs

      If you are only looking at EE7 now, you are behind already - time to catch up. If you haven't moved to HTML5 yet, this is a good evaluation project. Check your defects and support requests with regard to the stability needs of background, long-running processes. There may be some low-hanging fruit there.


      MyPOV

      Oracle has shipped a solid and attractive release with EE7. It now needs to maintain momentum to keep the install base and not lose more hearts and minds to the scripting languages. A roadmap and milestones for EE8, coming out soon, will be the first step. But for now take a deep breath and enjoy Java EE7. 

      P.S. Many thanks to Oracle's Mike Lambert and Claire Dessaux for a pre-launch briefing. 





      At this week's HP Discover conference in Las Vegas, HP announced a number of interesting new offerings; the one that caught my attention was the nicely crafted acronym HAVEn, so let's understand what it is - and what it is not. And I hope the movie buffs pardon the name bungling in this post's title, but this one was too tempting. 


      What is HAVEn

      HAVEn is the bundling of a Hadoop based, Autonomy and Vertica encompassing, Enterprise security enabled platform that allows for the building of big data apps - lots of them, n to be precise. Got it? And for good measure HAVEn also includes ArcSight - another A if you want. The first HP application to be delivered on HAVEn is HP Operations Management, which is a very good showcase of the platform's big data capability. And it is close to HP's home turf of helping IT run IT better. 

      From HP Presentation


      The other good news for HP is that, since HP is bundling existing products, everything that has been built in the past using Vertica, Autonomy and ArcSight will now also work on HAVEn - only enriched with enterprise security and the availability of Hadoop storage. That will make any potential move to HAVEn even more attractive for existing applications. 

      Proofpoint for Hadoop

      As with many announcements, not many specifics could be gleaned on how HAVEn will really work - something to sort out in the weeks to come. But the usage of Hadoop is a great proof point for the maturity of this technology - and there are two options for how HP may use Hadoop.

      (1) Hadoop as another data source
      This would be the low hanging fruit and a tacit admission that despite all the capability of Vertica and the sophisticated algorithms of Autonomy Idol - there is another key data source to get data from. Like many other BI / big data vendors HP may now admit to a co-existence scenario with Hadoop.

      (2) Hadoop as the platform
      This would be a much more ambitious approach than (1), making Hadoop the operating base and storage for all HAVEn data. And to some extent the Autonomy announcement of allowing the Idol engine to run in a Hadoop kernel points in that direction. But it would still see HP operate the Vertica stores for high performance data.

      As mentioned - we will see in the next weeks how HAVEn will really work. But regardless, it is a key validation of Hadoop as a viable platform and as a technology that has arrived in the enterprise - considering the uptake by HP, something like a knighthood for Hadoop.

      Clarification is needed

      It's close to impossible to provide an assessment of HAVEn as a system, since so little is known about it. HP will need to clarify a lot in the next weeks, and I am wondering if HAVEn will be only a loose marketing term for products bundled together - or a true big data platform going forward.

      From HP Presentation


      My hope for HAVEn is of course for it to be a true platform, one that will enable not only HP itself, but also its extensive partner ecosystem, to build the n applications on top that are part of the name. But for that to happen we would need the specs for HAVEn: what systems it runs on, how it authenticates, how it accesses systems and data, how it moves data, etc. - all things that need to be clarified.

      The Services Dilemma

      And as with all things HP these days, we are constantly reminded of the availability of HP professionals to help customers with these offerings. The irony for HP is that by integrating the components of HAVEn better, it takes away some of that services revenue. 

      But it's the right strategy, since customers will not have much patience left to pay for the integration of HP-acquired products. At some point the integration between synergistic products will simply be expected. And that HP has a synergistic big data play across Autonomy, ArcSight and Vertica has just been proven by the HAVEn announcements.

      Advice for HAVEn component customers

      If you are using ArcSight, Autonomy or Vertica today, keep using them. You will probably end up with HAVEn automatically, as HP integrates these components. Try to understand early what HAVEn is though, so you don't pay for custom services for something that you will get from HP soon - and likely for free. 

      Advice for HAVEn prospects

      It's too early to tell what HAVEn will be, and even further from predicting any market success. But if you are about to embark on a big data project, this is one of the more interesting platform announcements of the last quarters. So try to learn more and let HP explain this to you in detail.

      Advice for HP competitors

      If you have no bundling and integration plans for complementary big data pieces you have acquired, HAVEn is your wake up call. If you are a single product big data player - evaluate partnerships asap, as more integrated big data product announcements are on the horizon.

      Advice for HP

      Provide a lot more information on HAVEn. Explain the role of Hadoop. Put real product integration investment behind this to leverage the 2 + 2 = 5 synergy benefits you can hope for. 

      MyPOV

      HP has an early mover advantage now by bundling the acquired big data pieces in its realm. The question is what took HP so long, but that's a consideration of the past. It will be interesting to see how bundles like HAVEn change the dynamics of the big data market - away from many products and many hands, toward fewer products and fewer hands needed to be successful with big data. 
      But for now, congrats to HP for a promising launch - execution needs to follow. 


      Every technology market goes through different growth phases, and at this point I think we are witnessing the beginning of the second phase for the cloud market, in which the number of players increases, mainly through new market entrants. At the same time competition intensifies, as the combined forecasts of the market players exceed the overall market growth – so there is significant price competition in the market.




      Nothing to worry about in general; this is a normal phase for every technology market, as the market potential attracts more players than the market can bear long term. But it’s all for the better, as this process challenges the existing leaders, creates new players and sets up the market for stage three – continuous and sustained growth.


      Exhibit 1 – IBM wins GAO complaint

      The CIA was already looking to put some processes in the cloud last year, drew complaints at the time from Microsoft and AT&T, but ended up selecting Amazon’s AWS in January. 

      Ironically, the findings report triggered by the complaint found a cost advantage for the second best bid – IBM’s – but the CIA felt it wanted to go with proven technology.

      Not surprisingly, IBM complained to the GAO – and recently the complaint was upheld… so the CIA needs to re-tender… and we will stay tuned to what happens.

      What it shows though is that the cloud market is maturing, as this is the first time we see a GAO complaint in the cloud market - something that happens routinely in other government procurement situations. Anyone remember the Boeing / Airbus tanker selection skirmish?


      And simply put – IBM could not let this one slide; it is too big of an opportunity, with all the benefits of being one of the first cloud providers to the federal government, follow up business etc. – which is a sign that there are not yet enough large cloud opportunities out there for a player like IBM to simply walk away from this one.


      Exhibit 2 – HP and Red Hat bundle away

      Another sign of entering the 2nd phase of a technology market is that players partner and / or create bundles to differentiate their services from the other market players. So it happened last week, when HP announced their CloudOS at their HP Discover conference and Red Hat announced the Red Hat Cloud Infrastructure at Red Hat Summit.

      In both cases congrats need to go to the respective marketing teams under Marty Homlish and Jackie Yeaney for associating generic terms like OS and infrastructure with their offerings. A dream for any marketer, as the association of a generic term with your brand is the Holy Grail of associative thinking - e.g. "we picked HP because of their Cloud Operating System" – and guess what, no one else has a CloudOS. Same story for infrastructure.


      And it does not matter that behind the scenes there is nothing really new that HP and Red Hat have created – they just bundled existing offerings together. But there is value for customers in bundling, as the expectation is that the bundling vendor will enable an out of the box integration of these services. And the bundling vendors of course want to bundle with higher ground offerings that make their product unique and easier to differentiate and sell. So Red Hat of course uses Red Hat Enterprise Linux, and HP will soon enable their Moonshot servers as the hardware platform for HP Cloud OS.


      Exhibit 3 – Partnerships game heats up

      In the last keynote of the HP Discover conference – normally these closing keynotes are boring, wrap-up-the-message affairs – HP’s COO Bill Veghte unleashed a zinger for the cloud market, mentioning that Workday was moving to the HP Converged Cloud. A huge move – even in cloud terms – but Veghte peppered it even more by disclosing that Workday would be leaving their existing partner – and HP competitor – AWS.


      Only one journalist (@StevenJBurke – kudos!) picked up on this – I guess the rest were gone – and the replay of the keynote has not been made available by HP yet. So the news only slowly percolated – and prompted a re-commitment by Workday to AWS. 


      The sign of maturation in the cloud market to look for here is that there are fewer prized possessions left for the infrastructure players to claim in order to make their otherwise boring offerings more attractive. Expect more tug of war between the infrastructure vendors trying to get more of the prized and recognized SaaS vendors to adopt their cloud offerings.

      Advice for cloud consumers

      This is a great phase for the market to be in - and for you to make your first steps to the cloud, or to double your investments if you have already started. The vendors will vie for your business and offer it to you on most attractive terms, since they are trying to fulfill ambitious sales quotas. The risk is that you may pick a partner who will no longer be in the game in the next phase of market maturation.

      Advice for cloud market players - infrastructure

      You need to have a sound partnership strategy in place up the cloud stack, a pure acquisition strategy will not be enough in the longer term, unless you are really, really deep pocketed.

      Advice for cloud market players - platform

      Time to cast your strategy – will you be an open vendor that tries to partner with many of the infrastructure vendors, or do you pick a single one, or better a select few, of the infrastructure players? Equally, you need to look up the stack to make sure you are not being shut out of opportunities positioned at higher levels of the cloud stack.

      Advice for cloud market players – SaaS

      If you have your own infrastructure – keep evaluating it from a cost perspective. After cloud consumers, you are the most attractive group of prospects in the cloud market. If you partner – re-assess your partners in terms of cost effectiveness and next-market-phase survivability.

      Advice for cloud market players – Services

      Your services are the glue keeping the market together. Try to move up the cloud stack, where the more lucrative opportunities are – and toward those engagements that determine the utilization direction down the stack.


      MyPOV


      Great phase for the cloud market – it is graduating from an early interest phase to a competition among players for customers. Challenges exist for consumers of cloud offerings to bet on the winning horses, and for cloud vendors to become and stay a horse that is in contention. Exciting times.  



      I have previously shared my not so optimal experience installing Office365 from Microsoft and raised my concerns on the status of customer service in this post. But the OfficeSaga of Part 1 kept on giving through the last week - so it compelled me to write a little more on the state of multi-channel CRM in 2013 - which seems to be pretty sad. Here is the new Storify collection.


      During the heyday of CRM in the late 90ies of the last century, it was all about the chase to treat customers consistently across interaction channels and across organizational functions. The demo of the informed sales rep who, aware of a customer service / support situation, steers a sensitive customer interaction accordingly, was seen at every CRM show / demo. Likewise the mirror scenario - where customer service professionals are made aware of impending sales and treat the customer accordingly. Seen too often to ever forget. 

      In 2013 the situation has gotten a little more complex - as customers can not only be interacted with face to face and over the phone, but may also show up at your web store and in social media. But still, the promise of multi-channel - or, as the buzzword now goes, omni-channel - CRM is that the customer will be treated consistently across interaction channels and that all actors on the enterprise side are informed across organizational boundaries.

      The test: Install Office365


      As mentioned, I already chronicled my close to 24 hour challenge to install Microsoft Office. The case was closed with my 3rd install - with the help of Microsoft 3rd level support. But in the week after, the surprises started.

      In-Function Disconnects


      I was surprised when the first 3rd level support consultant followed up with me on the successful install of Office365. That would be great customer service if I had not in fact already succeeded in installing Office365 - but I had. And I had referenced the case number in the interaction with the second 3rd level engineer.

      But ok - giving the benefit of the doubt, I replied with thanks and good news. After all, my lesson learnt from this is - get to 3rd level support asap - and it is good to have a relationship with two 3rd level support professionals at Microsoft.

      And then witness my surprise when the same engineer followed up again - a day later - with the same question of whether all was good with my Office365 installation... at this point I decided to no longer reply.

      What should have happened in perfect CRM? The agents should have looped back with me once to check whether my Office365 was running well - and registered the answer. They also could have called me - they have all my phone numbers - which would have been even more personal, but ok.

      Surveys are great - they need to work

      The OfficeSaga Part 2 got even more lively - when I started to receive links to feedback surveys. 

      Great practice - only the first two links didn't work. Using Chrome first, I suspected a potential Microsoft issue and tried IE - also no luck. Back to Twitter to tell @Office - and what do I get on Twitter? The next non-working link. Then nothing.

      The next day I get my 3rd request to fill out a feedback survey - which then worked - both in Chrome and IE. 

      It sounds obvious, but test links to sites and surveys before you send them to customers. And if you have a Twitter conversation - granted, a challenged one due to the 140-character limitation - don't just end it, drive it to closure.

      More in function disconnects

      The highlight - and I hope by now the last one - was getting a call from a sales rep of the Small Business Division, asking if I wanted to buy the Office365 version for small businesses, as my free trial had expired.



      Looks like a very good sales practice - only I had already purchased a one-year subscription when I switched laptops, pondering the eventuality that I was not allowed to switch machines during the free trial period...

      And similar to the service rep who should have known that the case was closed, the sales rep could have saved the call to me, as I had purchased Office365 already.


      Advice to CRM users

      If you sell to and service customers across channels - do so consistently. If you haven't recently, test your systems from the outside - and hopefully you will not run into negative surprises.

      Advice to CRM vendors

      Check if your multi-channel story is complete and working. Do all users interacting with a customer have consistent information and access to the customer's past interactions? Check your timed actions - are they still in sync with business reality? Can users quickly validate the latest status of a customer before interacting with the customer?

      MyPOV

      It looks to me like the state of multi-channel CRM is more dire than I thought. If an enterprise like Microsoft, which is also a vendor of CRM systems, does not have a best-in-class implementation of CRM - how bad may it be with regular end users?
      And yes - consistent multi-channel CRM is hard - but it is what customers expect and deserve in 2013. Time to make it real.


      Oracle released its Q4 numbers, and in the earnings call there were the usual strong but entertaining statements on achievements and the competition. Andrew Nusca over at ZDNet has done a great job extracting the 25 striking things from that call; you can find it here. You can find the webcast here and a transcript here.




      During the call Larry Ellison also previewed events scheduled for next week regarding the next Oracle database release, Oracle 12c. Now, Oracle 12c was announced back at OpenWorld in 2012 as the first pluggable database - one that separates user data from metadata and allows multiple tenants in the same database.


      Multi-tenancy Confusion

      There is now some confusion around the term multi-tenancy. Though in general there is agreement that multi-tenancy means the co-existence of multiple tenants on shared resources, there are now two interpretations of the term.

      The classic multi-tenancy term referred to the database sharing data elements (or records) across tenants. That design was critical for the first and early SaaS vendors, as they needed to share precious database resources. Often this is referred to as a tenant-striped database.

      The Oracle view on multi-tenancy is that the user data becomes the tenant - and as you can run multiple user data stores (or containers, as Oracle calls them) in the same database, you have a multi-tenant database. Oracle complements this by separating the metadata from the user data, pointing multiple user data stores at a common set of metadata, thus achieving better hardware utilization and with that better elasticity of the database. Or in other words - you can run more databases on the same server with 12c.
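      To make the two interpretations concrete, here is a minimal Python sketch - not Oracle code, with hypothetical table names and data - contrasting a tenant-striped store (one shared table, isolated only by a tenant_id filter) with a container-style store (one data store per tenant, shared metadata factored out once):

```python
# Model 1: tenant-striped - all tenants share one table, separated only by a
# tenant_id column. Every query MUST filter on tenant_id, or data leaks.
striped_orders = [
    {"tenant_id": "acme", "order": 1, "amount": 100},
    {"tenant_id": "acme", "order": 2, "amount": 250},
    {"tenant_id": "globex", "order": 1, "amount": 75},
]

def striped_query(rows, tenant_id):
    """A safe striped query: this filter is the only thing isolating tenants."""
    return [r for r in rows if r["tenant_id"] == tenant_id]

# An unmodified tool that just reads the table sees ALL tenants' rows:
assert len(striped_orders) == 3

# Model 2: container-style - each tenant's data lives in its own store,
# while the shared metadata (the schema) is factored out once.
shared_metadata = {"orders": ["order", "amount"]}
containers = {
    "acme":   [{"order": 1, "amount": 100}, {"order": 2, "amount": 250}],
    "globex": [{"order": 1, "amount": 75}],
}

def container_query(tenant_id):
    """A tool pointed at one container can only ever see that tenant's data."""
    return containers[tenant_id]

assert len(striped_query(striped_orders, "acme")) == 2
assert len(container_query("globex")) == 1
```

      The sketch also illustrates the security point made below: an unmodified tool reading the striped table sees every tenant's rows, while in the container model it simply cannot.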


      The key advantages of 12c

      As already mentioned above, the separation of user data from metadata allows 12c to use fewer resources than its predecessors, and with that achieve better hardware utilization. Better hardware utilization - putting more user data on the same machines - means better elasticity of the offering, which is key for anything cloud these days.

      But there are more key advantages to this tenant concept. First of all, the standard tools you may want to run on the database still run - with no changes to the security model. A BI tool like e.g. SAP's BusinessObjects can just run on the 12c database with no modification. To the tool it looks like a database without multi-tenancy. In the striped multi-tenancy case, an unmodified tool would give access to all tenants' data - something clearly not desirable if you want to stay in business as a SaaS vendor.

      Moreover, you can move and copy the user data more easily. And you can change the schema of the metadata and the user data separately - and then just point upgraded versions at the right pairing of metadata and user data. A big advantage for upgrades and high availability.

      And finally, most enterprise software needs some way to customize it. With the striped multi-tenancy model this was very limited, as all tenants were on the same schema. With the new multi-tenancy architecture, more can be done to the individual schema of a tenant - theoretically anything, independently of the other tenants - but of course at a price when upgrading, no discussion needed.


      Is this new?

      Not really - most SaaS and PaaS vendors today will store one tenant's data in a separate database, often even on separate servers. The advantages are mentioned above - and often database scalability even drives that design (more about that next week). But these vendors pay the price with higher operating cost: all their databases run with the overhead of a one-to-one relationship of user data and metadata, which results in a larger footprint and with that higher cost to operate and less elasticity.

      Oracle's innovation is to separate metadata and user data from each other, achieving better elasticity for the offering. Whether Oracle will be able to make the upgrade to 12c transparent - in the sense that a pre-12c Oracle database user can take advantage of 12c easily and e.g. unstripe an older striped implementation - remains to be seen. It would make adoption of 12c much easier.


      Big expectations

      Larry Ellison mentioned events next week that will provide more details on 12c - and the endorsement of NetSuite, Salesforce and... Microsoft. And while NetSuite was hardly a surprise, Salesforce was more surprising - yet Ellison had almost kind words for Marc Benioff. What Microsoft may do here will be very interesting - expect lots of speculation until the event.


      MyPOV

      Oracle was not shy to tout 12c back at OpenWorld and now in the Q4 earnings call. With missed earnings, maybe a diversion strategy - but when 12c ships, it will change multi-tenancy as we knew it. And we can't wait for the partnership announcements Oracle said it would make next week. Heightened expectations.

      My latest take on Oracle overall can be found here - takeaways from the Oracle analyst summit. 



      When revisiting the Oracle earnings call of this week, it is pretty obvious that Oracle is trying to position Oracle 12c pretty much everywhere as the cloud database of choice. And not only position and try to sell – but make it an integral part of the cloud tech stack of well-known partners (NetSuite), lesser known partners (salesforce) and even competitors (Microsoft).






      So here is what Ellison said (courtesy of Seeking Alpha, emphasis added) in the Q4 earnings call last Thursday:


      Next week, we will be announcing technology partnerships with the most important - the largest and most important SaaS companies and infrastructure companies in the cloud. And they will be using our technology, committing to our technology for years to come. That’s how important we are doing 12c. We think 12c will be the foundation of a modern cloud where you get multi-tenant applications with a high degree of security and a high degree of efficiency, without having to sacrifice one for the other.

      Again, I would call them a startling series of announcements with companies like Salesforce.com, NetSuite, Microsoft - all of that will happen next week and will give you the details. These partnerships in the cloud I think will reshape the cloud and reshape the perception of Oracle technology in the cloud. 12c, in other words, is the most important technology we’ve ever developed for this new generation of cloud security.


      So let’s dissect and interpret this: Ellison makes it very clear that the aforementioned SaaS and IaaS companies will be using 12c for years to come. The design point of separating user data from metadata is the key architectural change of Oracle 12c from previous versions of the database. And he clearly mentions long term partners NetSuite and Salesforce, but also usual foe Microsoft. So what is going on?


      The Cloud market matures

      As Constellation Research showed last week with our post "The cloud is growing up - 3 signs from the news" (see here), the cloud market has entered a 2nd phase in which more vendors compete for less demand and at the same time need to accelerate their offerings - through acquisition (e.g. IBM buys SoftLayer), through bundling (HP announced Cloud OS) or through partnering (e.g. Google and Red Hat). And now we most likely have the most unlikely combination of partners working on a blended cloud technology stack.


      Oracle’s ISV business

      Let’s not forget that Oracle’s ISV business is an integral part of the Oracle revenue. And that for most of the last quarter century the largest Oracle ISV has been... SAP. So Oracle knows how to make partners successful on its database. And contrary to public perception, we are sure when the call was placed to 500 Oracle Parkway from One Microsoft Way in Redmond, Oracle was listening.


      The other remarkable aspect is that now, in a span of 20 years - back then Hasso Plattner with his decision to run R/3 development on Oracle's database, and now Steve Ballmer (and maybe even Bill Gates) - industry leaders have chosen Oracle as their strategic partner. Very, very few technology companies can pass that test over a 20-year time range.


      Microsoft’s problem

      The root cause of the expected Oracle Microsoft partnership lies deep in the history of Microsoft's technical decisions. When it was clear that Microsoft needed a SQL database and it partnered with Sybase, it made the decision to run SQL Server on the Windows technology stack - and only there. That limited the number of cores that were supported and allowed the database team to - let's be polite - not address scalability issues in the best way.


      All this was hidden while the world was running applications on premise. And it was also hidden as long as the load on the database server side was manageable. Ever wondered why the Microsoft enterprise applications only had a SMB focus? And why Microsoft ran internally on SAP?


      So this will be a key case study in how platform decisions and technical debt can creep up on even one of the largest and most successful technology vendors. But kudos go to the Microsoft executives for what looks like really jumping over their shadows and addressing the technical issues through a partnership with Oracle.


      Virtualization layer complications

      So where will the line in the sand be between Oracle and Microsoft IP and products? Next up from the database in the technology stack you hit the virtualization layer - and here Oracle and Microsoft have their respective own offerings with Oracle VM and Hyper-V. We expect this is where Microsoft will draw the line, and Oracle 12c will have to find a way to support Hyper-V.


      At the end of the day this is a reasonable architectural fault line - it allows the Microsoft application code to remain virtualization-layer agnostic, while it requires Oracle's database to become compatible with different virtual machines. And this makes sense for Oracle as it comes back to its DNA as a partner for ISVs - with the virtualization layer becoming something similar to the ODBC of the cloud age.


      At the same time it gives Oracle the chance to optimize a little better with its very own Oracle VM - which will be key when pitching the overall Oracle tech stack to the many ISVs who do not own a virtualization offering themselves.


      So this would be a reasonable compromise which ultimately is a win-win for both sides, though short term it will put some architects in Redwood Shores into high gear.


      Did Microsoft have options?

      The only other real option that Microsoft could have looked at would have been IBM. And IBM would in general have been a more compatible partner than Oracle – at least from the general outside perception. And though this is speculation, Constellation is sure that Microsoft will have done some due diligence on Armonk’s DB2.


      And then Microsoft could have gone more radical by e.g. looking at using Hadoop as a conventional data store (see here) - but that would most likely have pushed the limits a little too aggressively... though for a second, think of storing all the information that Microsoft applications use and create in one single and consistent data store. Not a solution for 2013, but for 2014+. Obviously Microsoft's need was much more immediate - like running the Dynamics applications and gaining SaaS market share.


      So why Oracle?

      We don’t have the details, but the savings that the Oracle 12c database achieves by de-coupling metadata from user data must be so impressive that they even convinced Microsoft to partner. We cannot think of many better benchmarks for Oracle 12c.


      And while the hint of NetSuite adopting 12c is not surprising, the adoption by Salesforce is another proof point of the achievements Andy Mendelsohn and team have put in place with 12c.


      Both Microsoft and Salesforce know the SAP story - and how that 20+ year old decision of Hasso Plattner to build R/3 on Oracle has shaped the Oracle, SAP and RDBMS markets and ecosystems. We are certain Microsoft did not make this decision light-heartedly. And surely Salesforce may have wanted to rid itself of the Oracle dependency. But ultimately the cloud business is all about cost of ownership - and if someone has a silver bullet, you need to have it, too - or your days may be counted as a competitive cloud vendor.


      Oracle deserves credit that - again contrary to widely held public perception - 12c is available to partners, even widely perceived competitors, alongside Oracle's internal development of Fusion Applications. All rightful concerns about Oracle withholding the platform for its own advantage need to take a pause.


      And we are really curious where Oracle and SAP are on bringing the SAP products to 12c.


      Advice for customers

      This is good news for Oracle and Microsoft customers. Microsoft customers get a scalable database under the Microsoft SaaS applications; Oracle RDBMS customers get more usage of 12c and another way to build applications for 12c. And at the same time Oracle's tech stack and applications teams now have an external benchmark: they need to be better at building on top of 12c than their respective competitors. So as a customer - wait, see and validate the expected benefits.

      Advice for partners

      For a Microsoft partner, this makes your business more viable in areas where the sizing teams would have cringed before and where the hardware cost could have been prohibitive. For Oracle database partners this expands the addressable market. And for ISVs in general this is great news, as you may now have the choice to develop in Java or C# - with the latter no longer being limited by database capacity. You still take a dependency on the cloud technology stack you will be using, but once Hyper-V is supported by Oracle 12c, it may be an option to run your C# applications in an Oracle data center.


      MyPOV

      We congratulate both companies on the partnership and see this as a net positive - Oracle is true to its technology partner foundation and Microsoft has solved a long-term tech stack weakness that is exposed by the nature of the cloud. It's now execution time for the technical teams and we look forward to learning soon about the first product and customer proof points - maybe as soon as the Build conference this coming week.



      The only negative: We are sad that one of the best April Fools headline is gone forever … 



      This is a joint post with my colleague Alan Lepofsky, who looks at IaaS/PaaS and Future of Work technologies for Constellation Research. You can find this here, too.

      Late last week the news came out that Intuit has acquired Elastic Intelligence, maker of the Connection Cloud product - one of the few cross-platform, cloud-enabled BI solutions in the market. Intuit will use Connection Cloud to complement the capabilities of its QuickBase solution.


      We look at this event from the Future of Work, Data to Decisions and Consumerization of IT perspective - respectively through the lenses of our analysts Alan Lepofsky and Holger Mueller.



      The Collaboration Take


      One of the basic tenets of enterprise collaboration software is that it allows people to work together to achieve a common goal. Example goals include planning an event, creating marketing material, closing a sales deal, or any one of a thousand other use-cases where people work together to get their jobs done. Without a consistent structure for entering information, the data in these collaboration platforms becomes difficult to search, filter and report on. Products that use form-based entry solve this issue by having people enter information into specific fields rather than into a blank wiki page, blog entry or community forum.


      Intuit QuickBase has been around for more than a decade, allowing people to create applications without having to be an application developer. The acquisition of Connection Cloud and its future integration with QuickBase should allow people to integrate data from other enterprise systems into the QuickBase applications they create.


      For example, an organization may be able to create an application for the Sales team that pulls in data from both their CRM and their ERP system, allowing them to get an account overview that is not available in either of those systems on its own.
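      The cross-system account overview described above can be sketched in a few lines of Python. This is a hedged illustration only - the CRM/ERP records, field names and join logic are hypothetical, not Connection Cloud's actual API:

```python
crm_accounts = [  # e.g. rows pulled from a CRM system
    {"account": "Acme", "open_opportunities": 3},
    {"account": "Globex", "open_opportunities": 1},
]
erp_invoices = [  # e.g. rows pulled from an ERP system
    {"account": "Acme", "outstanding_balance": 12000},
]

def account_overview(crm, erp):
    """Join the two sources on account name into one combined view
    that neither source system offers on its own."""
    balances = {row["account"]: row["outstanding_balance"] for row in erp}
    return [
        {**row, "outstanding_balance": balances.get(row["account"], 0)}
        for row in crm
    ]

overview = account_overview(crm_accounts, erp_invoices)
assert overview[0] == {"account": "Acme", "open_opportunities": 3,
                      "outstanding_balance": 12000}
assert overview[1]["outstanding_balance"] == 0
```

      The point of the sketch is the design choice, not the code: the value sits in the joined view, which is exactly the kind of artifact a form-based, end-user tool like QuickBase can host.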


      The SaaS Take

      Software as a Service (SaaS) is the growth engine for enterprise applications in general. The unique nature of QuickBase is its capability to get end users to build and maintain surprisingly elaborate business applications. The extensive library of partners and building blocks gives QuickBase users a powerful but end-user-manageable arsenal of functionality.


      SaaS vendors need to continuously expand their capabilities and the addition of business intelligence functionality is a key value add for QuickBase.


      The BigData Take

      One of the biggest challenges for enterprises today is how to create value from big data projects. Though Connection Cloud is not necessarily a big data play, using the product can likely lead to one. With the capability of using many of the leading SaaS OLTP products as a data source, Connection Cloud is one of the few products to provide out-of-the-box cross-SaaS business intelligence... and with the combination of multiple OLTP sources, data volumes could quickly move to (lower-end) big data volumes.


      It will be interesting to see if Intuit can capitalize on the big data trend - especially in the light of maintaining end user ease of use.


      The  Enterprise Take

      One of the most interesting developments in enterprise applications - for quite some time now - has been end-user programming. No vendor has really tackled the challenge with a workable solution, but Intuit is one of the closest vendors to successfully addressing the topic with QuickBase.


      Through the combination of the existing applications and the capabilities of Connection Cloud, which enables more business intelligence content, Intuit's lead in this area will be solidified. And it makes a whole new set of applications possible. While previously the OLTP vendors had a lock on the BI and reporting solutions running on their own application and product frameworks, it may now be possible to build QuickBase applications on top of those. This gives QuickBase a new value proposition for building applications.


      Another possibility is that Intuit will use Connection Cloud not solely for pedestrian reporting and BI needs, but to extract more data from the SaaS OLTP applications. This would make the integration of QuickBase with SaaS OLTP easier and again open new dimensions of QuickBase application scope.


      However, as before, Intuit will have to address the write-back problem. Right now QuickBase makes it easy to build one-way applications, in the sense that you take data from another system, import it and work on it in QuickBase.


      Likewise it supports island applications with self-contained data storage in QuickBase. What Intuit needs to address are circle applications that allow users to start in a 3rd party application, add value through a QuickBase application and then return the data back to that (or another) 3rd party application. Or better for QuickBase - start there, hand data over to a 3rd party, process something there, and then return to QuickBase. It matters in enterprise applications where business processes get started and lead to final outcomes.
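      The one-way versus circle distinction can be sketched as a minimal dataflow. This is a hypothetical Python illustration of the pattern, with stand-in dictionaries for the 3rd party system and the QuickBase-style app - none of it is actual QuickBase API code:

```python
# A record living in a 3rd-party system (e.g. a CRM lead).
third_party = {"lead-42": {"name": "Acme", "score": None}}

def import_record(source, key):
    """One-way import step - what QuickBase already supports today."""
    return dict(source[key])  # copy, so the source is untouched so far

def enrich(record):
    """The value-add step inside the end-user application."""
    record["score"] = len(record["name"]) * 10  # stand-in scoring logic
    return record

def write_back(target, key, record):
    """The missing piece the text calls for: closing the circle by
    returning the enriched record to the system of origin."""
    target[key] = record

rec = enrich(import_record(third_party, "lead-42"))
write_back(third_party, "lead-42", rec)
assert third_party["lead-42"]["score"] == 40
```

      Without the final write_back step, the enriched data stays stranded in the island application - which is precisely the limitation described above.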


      Advice For Customers

      This is good news for QuickBase customers, who get key capabilities added to the product. It’s time to re-evaluate scope of your existing QuickBase applications and see how the additional capabilities will add value to these solutions. Likewise, with the expanded capabilities it’s time to see which new applications you may decide to build with QuickBase.
      On the flipside, the formerly amicable relationship between Elastic Intelligence and the SaaS OLTP vendors may now change, given QuickBase's competitive status in the overall enterprise applications landscape. So monitor how many connections the Connection Cloud product will retain when run by Intuit.


      Advice For Partners

      This is exciting news and as customers revisit their application portfolio, you should review your product roadmaps and service offerings. How can the future additional capabilities of QuickBase make your offerings stronger and more attractive in the market, how can these capabilities help address new automation areas? These and similar questions should be addressed quickly.


      Advice For Competitors

      You should not be surprised, as Intuit will keep investing in QuickBase. You will need a strategy to address the power of end-user programming and the disruptive nature of that trend for the conventional enterprise applications space - be they SaaS or classic on-premise applications. Intuit (and others) have not fully figured this one out yet - but someone will come along sooner rather than later and you need to be ready. While Elastic Intelligence enhances Intuit's integration capabilities, Intuit is still weak on collaboration/enterprise social networking features. In today's "social business era", vendors that provide a strong set of collaboration features such as liking, commenting, rating and sharing should utilize this as a competitive advantage.


      Advice for Intuit

      This is a great move; you will have to make sure you integrate the more complex BI capabilities in a user-friendly way into QuickBase - and make them configurable, usable and extendable by a skilled business end user. Likewise you need to address the circle-natured applications we mentioned earlier. Intuit's next acquisition should focus on improving its enterprise social networking capabilities, enabling people to create, collaborate on and share information in QuickBase applications.


      Our POV

      A good acquisition by Intuit that helps further differentiate QuickBase. Adding business intelligence capabilities to its current application scope provides QuickBase with extended attractiveness to both customers and partners. Keeping QuickBase easy and intuitive to use is the emerging challenge. We look forward to hearing more details on roadmap, pricing and availability.


      This is a joint post with my colleague Ray Wang of Constellation Research.


      At a press conference on June 24th, 2013 with Microsoft's CEO Steve Ballmer, Oracle's President Mark Hurd announced a cloud partnership under which Windows Azure and Windows Server Hyper-V customers will be able to run Oracle software: the Oracle Database (no version mentioned, but Constellation expects this to be 12c), Oracle WebLogic and Java.





      Oracle also announced availability of Oracle Linux for Azure customers. Constellation believes that the Oracle 12c, WebLogic and Java stack pieces will be deployed on Oracle Linux. Should this be true, the approach makes sense, as this is a tested and proven hardware and software combination. Further, Microsoft has already begun to run parts of Azure on Linux.


      The partnership alliance poses significant implications for both vendors and more importantly customers moving to the cloud for three reasons.


      • Java comes to Azure, a sign of pax in the .NET vs Java wars.  For applications to run on Azure, they needed to be built in CLR-based programming languages. Now, with the licensing of Java by Microsoft as part of this partnership, Java applications will run on Azure. This opens doors for Java applications on the Azure cloud, as well as more portability for Java applications in general. And Azure becomes a friendly cloud for the 9 million+ Java developers out there.


      Point of View (POV): Microsoft and Oracle strike a win-win here.  Microsoft gains more language-derived potential for expanding Azure and Oracle adds a marquee cloud stack that supports Java.  Given the substantial overlap of enterprise customers between Microsoft and Oracle, customers will benefit from more cross-cloud compatibility for Java while supporting Azure for IaaS.


      • Azure will run Oracle WebLogic and the Oracle Database.  Microsoft will support Oracle Linux in Azure as the foundation to run the middleware and database stack.  Though the press release and the press conference did not specify which Oracle database, Constellation speculates this is Oracle Database 12c. In addition, Oracle announced license mobility for customers who want to run their software on Azure and bring Oracle Linux to Azure.


      POV: Interestingly enough, when Larry Ellison spilled the news of this announcement during the Q4 Oracle earnings call, it was not about the Oracle Database in general, but very specifically about Oracle 12c. It's not clear why 12c is not specifically referenced in the press release - but with Oracle 12c general availability slotted for June 25th, 2013, this moment may not have been the time to steal the thunder.

      Of note, it is not only the database, but also the WebLogic application server that will be deployed on Azure. This comes as a surprise at first, but given the work Oracle has done to integrate the former BEA flagship product with 12c and Java, it was a question of taking the whole technology stack and avoiding too many interfaces. Why run Java apps through BizTalk to an Oracle database?


      Constellation views this as a smart move by both companies, as it allows Azure customers to utilize more of the Oracle products, that are more and more entwined due to the Fusion and Exaxxx products.   


      • The hypervisor is where Microsoft and Oracle draw a line in the sand.  Oracle will support Microsoft's hypervisor Hyper-V as the demarcation line between higher-level application code and the Oracle products that now run in Azure.  The combined offering will run on Hyper-V - which creates some headaches for Oracle on the hypervisor level, as Constellation predicted - and will be supported by Oracle support as running on Windows Azure.


      POV: This poses some engineering work for the Oracle hypervisor teams, but nothing impossible to achieve. And the benefits are tangible: applications built for Hyper-V will now be able to run on the Oracle Database (12c, on Oracle Linux). This gives a lot of performance-critical applications (think Dynamics) that were previously limited by SQL Server scalability new breathing room.

      Microsoft was able to protect the higher-level applications of its technology stack with this agreement, and at the same time Oracle benefits from a whole ecosystem of Hyper-V compatible applications. The cost of supporting Hyper-V for Oracle, while tangible, is dwarfed by this additional market potential. And it gives Microsoft an important leg up against VMware's vSphere.  Constellation believes this has significant implications in the cloud stack wars among Amazon, Google, HP, IBM and VMware.


      In unusual candidness for these kinds of partnerships, Oracle listed the current and future deliverables of the alliance in a blog post here.

      Why did this happen?

      As previously mentioned, this would have been a very good April Fools' headline - even back on April 1st, 2013. So this alliance comes as a surprise to pretty much all industry observers; at least we have not seen anyone claiming to have seen this one coming.


      Constellation can only speculate about what has driven Oracle and Microsoft to become frenemies and co-opetitors. But the usual drivers are customers and technology. Customers could be the biggest driver for this alliance (e.g. a large public sector client that has standardized on Azure but requires Oracle, maybe for security or scalability reasons). On the technology side, Oracle has achieved significant and game-changing elasticity through the de-coupling of metadata and user data storage in Oracle 12c. In its due diligence Microsoft must have looked at this design point, and it must have been clear that SQL Server would not be able to match it. It will be interesting to see in the months to come what the real drivers of this alliance have been.


      Lastly, it is necessary to mention that primarily Microsoft, but to a certain point also Oracle, is interested in differentiating its cloud offering versus Amazon's AWS and Google's GCE. And this alliance certainly helps in that process.

      Implications for the market

      Over the past decade, Oracle has emerged as the laggard in the cloud market.  VCs had advised their startups not to build on Oracle to avoid the cost overhead and legacy database technology.  Yet Larry Ellison remains the rare master of Sun Tzu's Art of War strategies.  In this latest effort, he shows his determination to serve as the arms dealer for cloud infrastructure.  Announcements on partnerships with Amazon, Dell, now Microsoft and soon with Salesforce.com and NetSuite show his determination to remain relevant in the cloud, though very late to the party.


      The irony is that it all comes back to Ellison's original view - that the cloud is nothing else than servers connected to the internet. And to a certain point that is what the Oracle Linux machines running Oracle 12c, WebLogic and Java will be. Only they will be more elastic than other commercial database offerings - but we will have to see more detail at the 12c announcement tomorrow.


      For the overall cloud market this is a positive development, as among the dedicated cloud stack vendors - AWS, Google, Microsoft and Oracle - it creates a level of reuse and commonality that previously was not thought possible. Java applications now run on all four of the aforementioned cloud stacks. The Oracle database runs on all but Google's. As does Oracle Linux (we assume that is also how AWS deploys Oracle). So we are not yet at a time of full interoperability - but this alliance is certainly propelling the cloud further in these terms.


      The Bottom Line: The Irony Is, the Database Is Back Again

      At the end of the day two veterans of the enterprise software industry, Hasso Plattner and Larry Ellison, are re-inventing their companies through database innovations. It looks like enterprises still want and need to store data reliably and efficiently. May it be in memory with HANA, or may it be with better overall elasticity in 12c - no mention of cloud. Remarkably, both innovations would have been beneficial for their respective companies even in a pre-cloud era. So yes, the database is back. And with it a chance to rebuild and re-invent the whole enterprise technology stack upwards.



      Our POV: The Cloud Wars Have Just Begun, Customers Poised To Win

      This is the positive announcement expected over the weekend. The cloud sure makes strange bedfellows, and it is the real driver and winner - both Messrs. Nadella and Hurd clearly identified that. With the addition of Java to the overall mix, more interoperability has been achieved than customers would have expected. Overall this is good news for the cloud and, more importantly, for Microsoft's and Oracle's customers and partners.


      Before customers can rejoice, availability, pricing and customer successes must come first.



      This morning, as expected, the next partnership announcement from Oracle came out. After Monday's partnership announcement with Microsoft (our analysis here and here), it was Salesforce.com's turn today. It was expected to happen this week, as Oracle's CEO Larry Ellison pre-announced these partnerships during Oracle's Q4 earnings call last week.




      Here are the takeaways from the short press release: (all emphasis added)

      Takeaway 1:

      Salesforce.com [NYSE:CRM] and Oracle [NASDAQ:ORCL] announced today a comprehensive nine-year partnership encompassing all three tiers of cloud computing: Applications, Platform and Infrastructure.

      MyPOV
      This makes the announcement's scope larger than the one with Microsoft yesterday - it includes not only infrastructure and platform, but also applications. Applications were not mentioned yesterday, though there were clear implications for the application space with the added capability to deploy Hyper-V applications to an Oracle WebLogic and Database stack on Azure. And it makes sense, as salesforce.com is much more a SaaS company than the more IaaS- and PaaS-centric Microsoft Azure offering that was the partnering product yesterday.


      Takeaway 2:

      Salesforce.com plans to standardize on the Oracle Linux operating system, Exadata engineered systems, the Oracle Database, and Java Middleware Platform. 
      MyPOV
      Oracle Linux will get a lot of work in the near future, being the OS of choice to deploy the Oracle technology stack, both for Microsoft Azure and the Salesforce.com cloud infrastructure. Likewise, the Oracle Database will power both - we think starting with Oracle 12c, the release that enables the benefits mentioned below, with its general availability happening any day now.

      The difference from the Microsoft announcement lies in the commitment to Exadata, but that again should not surprise, as salesforce.com's cloud infrastructure (so far?) has been designed around very large database servers. This is the sweet spot for the Oracle engineered systems, not so much the lower end that is used in most IaaS offerings.

      And here salesforce.com will make different decisions than Microsoft, since salesforce.com is much more a SaaS offering and Azure much more an IaaS offering. Lastly, the Java Middleware Platform is mentioned - not WebLogic, as in the Microsoft alliance. But then the Azure deal was all about Java support, and with salesforce.com we do not see any reference to programming languages. That is not too surprising, as salesforce.com supports Apex, and Java bytecode compatible programming languages with Heroku.

      Maybe salesforce.com will move Heroku pieces over from AWS to the new Oracle-based salesforce.com cloud platform? Potentially salesforce.com can now unify its platform again.


      Takeaway 3

      Oracle plans to integrate salesforce.com with Oracle’s Fusion HCM and Financial Cloud, and provide the core technology to power salesforce.com's applications and platform. salesforce.com will also implement Oracle’s Fusion HCM and Financial cloud applications throughout the company.

      MyPOV
      And here we come to the application aspect of the announcement. The integration of salesforce.com's CRM products with Fusion HCM and Fusion Financial Cloud was a surprise. This may be the reason for the inclusion of the Java Middleware Platform in the announcement, as this integration is clearly Oracle's responsibility. And Oracle certainly wants to use its homegrown, standard Fusion integration products with the Java Middleware Platform.

      In return, salesforce.com will implement Oracle Fusion HCM and Financial Cloud, making itself a key reference for the integration. This is a bold and disruptive step, replacing Workday for HCM and taking salesforce.com away as a potential reference case for the salesforce.com ecosystem - e.g. FinancialForce was mentioned as an option for salesforce.com's future finance automation back at Dreamforce 201. And it is a Fusion showcase in itself, too, as salesforce.com uses Oracle Financials today.

      But it raises questions for the future of work.com, the recent HCM acquisition of salesforce.com. And it gives salesforce.com the potentially much needed, cloud-based, out-of-the-box integration to an ERP package.

      Though not announced, this could set up salesforce.com as a giant re-seller of Oracle Fusion Apps. Other ERP vendors (SAP!) would no longer be able to marginalize Salesforce.com as a CRM-only vendor. The analogy to Siebel Systems - which ultimately ran out of roadmap and products to sell, leading to its demise and the Oracle acquisition - would also be addressed for salesforce.com.


      Takeaways from the Quotes

      “Larry and I both agree that salesforce.com and Oracle need to integrate our clouds,” said Marc Benioff, Chairman and CEO, salesforce.com. “Salesforce.com's CRM integrated with Oracle’s Fusion HCM and Financial Cloud is the best of both worlds: the simplicity of salesforce.com combined with the power of Oracle.”

      “We are looking forward to working with salesforce.com to integrate our cloud with theirs,” said Larry Ellison, CEO, Oracle. “When customers choose cloud applications they expect rapid low-cost implementations; they also expect application integrations to work right out of the box – even when the applications are from different vendors. That’s why Marc and I believe it’s important that our two companies work together to make it happen, and integrate the salesforce.com and Oracle Clouds.”

      “With over 1 billion complex transactions delivered every single day, an Oracle Linux and Exadata Infrastructure will make salesforce.com a more efficient company– and our customers will benefit,” said Parker Harris, Co-Founder and Executive Vice President, salesforce.com. “Deploying Exadata engineered systems throughout our data centers will allow us to significantly lower overall hardware, floor space and energy costs, while simultaneously providing our customers with higher performance and better reliability.”


      MyPOV
      Ellison and Benioff mention integration four times in their short statements, Ellison beating Benioff 3:1 - not surprisingly, as Oracle has been honing the integration message much longer than salesforce.com. But we think that both CEOs are on the right path: customers expect simplicity (Benioff), rapid low-cost implementations and out-of-the-box working integration (Ellison).

      Oddly, Parker Harris gets an unusual 3rd quote in the short press release, as these are usually balanced affairs - but he raises the key value drivers that in our view have led to this partnership: more efficiency and ultimately lower Total Cost of Ownership (TCO) to run a cloud infrastructure.

      So why?

      Oracle must have something that, these days, every cloud company seems to want: a reliable database, an attractive technology stack and, first of all, a very attractive TCO.

      At OpenWorld in 2012, when Oracle unveiled 12c, this slide caused some uproar in the database community:

      There was a reasonable debate whether the use case was realistic - but at the end of the day, if the 12c savings from the feature confusingly labelled multitenancy are only 20% of what Oracle claimed, then Oracle 12c is still a huge TCO saver. The famous no-brainer to implement. And very compelling for Oracle customers (like salesforce.com) or vendors with a database problem when turning to the cloud (like Microsoft).

      And while the rumor is out there that this is a 9-year deal and salesforce.com is paying something in the area of 300M US$, it's still a good deal financially for salesforce.com. A quick back-of-the-napkin calculation: 9 years are 108 months; call it 100 months for easy math, meaning salesforce.com pays about 3M US$ per month for using Oracle's Database, and more. Assuming it would just be the database, that is not a bad number if you look at the alternative (see below). And salesforce.com got more - Java Middleware, probably some Exadata and possibly even the usage of the Fusion Apps.
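      For what it's worth, the napkin math holds up without the rounding shortcut. A two-line sketch (remember, the 300M US$ total is a rumor and the deal length the only announced figure):

```python
# Back-of-the-napkin check on the rumored salesforce.com / Oracle deal.
# Both inputs are rumors / rounded assumptions, not confirmed numbers.
deal_total_usd = 300_000_000   # rumored total value of the deal
deal_months = 9 * 12           # announced 9-year term = 108 months

per_month = deal_total_usd / deal_months
print(f"{deal_months} months -> {per_month / 1e6:.2f}M US$ per month")
```

      The exact figure comes out just under 2.8M US$ per month - close enough to the ~3M US$ the 100-month shortcut yields, and still a modest run rate for a database (and more) at salesforce.com's scale.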


      The alternatives for Salesforce.com were limited

      Of course salesforce.com could have stayed where it is today. But then it would surely have spent more on running its current cloud technology stack, and the Oracle part of that would be aging quickly. Salesforce.com has always moved to the latest Oracle database releases as soon as it could be confident to run them. And there is a benefit to being current.

      At the scale that salesforce.com runs - north of 1B transactions per day - the only option would have been IBM. But like Microsoft, which faced similar due diligence questions, it ended up with Oracle. As in yesterday's post, the praise goes to Andy Mendelsohn and his team.

      Market implications

      Oracle gets a design and marquee win - the largest SaaS vendor in the market. And a potential reseller, or joint sales engagement partnerships at SAP customers, which salesforce.com is very successful at selling into. At the same time, Oracle is now the database of choice in all large clouds but one (Google).

      Salesforce.com can put away some key technology decisions - they are settled for the next 9 years and likely longer. And it gets a cloud ERP option to counter enterprise scale arguments in competitive engagements with SAP. Personally, I would expect the salesforce.com account manager to have the roadmap for the joint salesforce.com CRM and Oracle Fusion Apps in every slide deck when competing with SAP.


      Customer implications

      When two large cloud players agree to better integrate their products, their customers win. When they choose solid technology as their foundation, and the products of that foundation hence get more usage, customers win again.

      It will even give SAP and Infor customers (just to mention the other 2 players of the Top 4 enterprise application vendors) more options on what to deploy for their enterprise automation products. And it is an overall positive trend when the CEOs of two key cloud players see lower operating cost and pre-built integration as key issues they need to address going forward.

      MyPOV

      We have seen act II of the Oracle-composed "the cloud changes everything" piece. Who would have thought these suddenly amicable relationships - Oracle with Microsoft, and now with salesforce.com - would ever come up? Even though, behind the scenes, these companies have had long-standing support and development relationships.

      The cloud seems to make a lot possible these days.


      ---------------
      Have a look at my colleague Frank Scavo's take on this - here - he sees the great detente. And I always enjoy Dennis Howlett's take on this over at Diginomica.
