Whenever something new appears on the horizon, it’s a good time to look back. Hitachi Data Systems’ intention to acquire Pentaho is such a remarkable move that it opens a great new chapter in Pentaho’s history.
As I researched the details of the journey, I was happy to find that the referenced postings are still available on the Internet after all these years. All the other details came from my memories, so I hope they are accurate, especially the times of day! Please keep in mind that this is my personal view and that I’m a Kettle addict, so European and German filters, together with a priority on Pentaho Data Integration (aka Kettle) topics, might apply here and there…
Sep 8, 2004 (right after dinner) – I read Matt Casters’ post #30 on the BIRT mailing list about Kettle version 2.0. By the way: the BIRT project was just getting started, and this was even before it was officially accepted by the Eclipse Foundation on Oct 6, 2004.
At this time Kettle was closed source, and I got a trial license code from Matt to play with it. Our intention at Proratio (my employer at the time) was to use it to load the company’s data warehouse from our ERP system, which had an extremely complex condition system, so the required combination of steps would have been a nightmare. I asked Matt whether there was some kind of plug-in possibility for Java code, and his answer was roughly: “This is a good idea, but since I’m doing this part time and on the weekends, I don’t know when it will be finished.”
In the meantime I developed a connector to SAP (some of the original bits and bytes still live on today in the godesys SAP-Connector, formerly ProERPConn by Proratio) and thought it would be a good time to check with Matt again. Surprisingly, he had just finished the plug-in capability. Sometime in the summer of 2005 we sat down together in Mainz and managed to integrate the SAP connector into Kettle in a very short time (between breakfast and lunch).
Early Dec 2005 – Matt Casters released Kettle as open source under the LGPL license (it moved to Apache 2.0 later on). For those who would like to know more about the history of Kettle (Matt started working on it in 2001), have a look at the Pentaho forum post “Project road map, history of kettle”. It also contains a zip file of the very first Java version of Kettle and a screenshot of version 1.0. You can read a bit more about the story in the book Pentaho Kettle Solutions.
Dec 9, 2005 (right after tea time) – I became aware of Pentaho by reading a press release in Computerwoche, „5 Millionen Dollar Venture Capital für Open-Source-BI“ (“$5 million venture capital for open-source BI”), and let Matt know about this interesting company.
Dec 12, 2005 – Matt Casters’ first forum post at Pentaho: “[…] Therefor, I would like to convince you to include a powerfull ETL tool that allows users to quickly build and maintain a full data warehouse. For me this means including slowly changing dimensions. Copying a couple of tables and hoping that the reporting engine will be able to cope just won’t do. Please feel free to consider including Kettle. Kettle is an ETL tool that turned LGPL about 10 days ago. […]”
Apr 4, 2006 – The press release went out: “Pentaho Acquires Kettle Project – Fastest Growing Open Source BI Project Strengthens Portfolio With Best-in-Class Data Integration.” This was also when Kettle became known as Pentaho Data Integration.
May 23, 2006 – Press release announcing Kettle (Pentaho Data Integration) training for Europe, offered by Proratio and starting in Mainz, Germany. Proratio became a Pentaho partner, and we delivered trainings and workshops across Europe.
Jun 2007 – I joined Pentaho as employee #3 in Europe. #1 was Thomas Morgner (founder and chief architect of the Pentaho Reporting engine) and #2 was Matt Casters. Since we had a limited head count (not only in Europe), I wore multiple hats: working on the code base, documentation, QA, support, training, proofs of concept, presales, community, even guerrilla marketing, and maybe something else I forgot.
Dec 2007 – Pentaho won its first German customer, a large health insurance company.
Jun 2008 – The first Pentaho community event in Mainz was a great success: it brought people together with their ideas and projects, let them get to know each other personally, and, last but not least, laid a foundation stone for the community to continue and grow over the following years.
2008–2015 – Read some more key points along the way from Richard Daley: “The Pentaho Journey – A Big Data Booster”.
Feb 10, 2015 – “Hitachi Data Systems Announces Intent to Acquire Pentaho to Deliver More Value From Big Data and the Internet of Things That Matter.”
This acquisition builds on an existing OEM relationship between the two companies and is a core component of the HDS strategy to accelerate its Social Innovation business and become a leader in IoT. Social Innovation is the unifying strategy across Hitachi businesses to deliver solutions that enable healthier, safer, and smarter societies. Pentaho’s vision of creating transformational value from the data generated and interconnected across people and things is brought to life by a big data orchestration platform that powers embedded analytics.
For further details, check out the Social Innovation News on HDS Community Innovation Center.
This is an exciting time for both companies, our customers, and our partners. We share a vision around analytics and, in particular, around big data opportunities.
I’m very thankful for the right decisions, the timing, and the passionate people all around – at Pentaho, in the community, and across the partner landscape – who together made this success possible.
This was the short story of a great journey, and it still continues 🙂