Pentaho Gears Up Analytics Platform for the Future of Big Data

Pentaho Business Analytics 5.0 greatly simplifies the entire analytics experience and delivers the industry's first just-in-time big data blending 'at the source'

ORLANDO, Fla., Sept. 12, 2013 /PRNewswire/ -- Delivering the future of analytics, Pentaho Corporation announces the availability of Pentaho Business Analytics 5.0, a completely redesigned data integration and analytics platform. Pentaho 5.0 provides a full spectrum of analytics for today's big data-driven businesses regardless of data type and volume, IT architecture or analysis required. The new, modern interface simplifies the user experience for all those working to turn data into competitive advantage. Pentaho 5.0 includes more than 250 new features and improvements. Highlights include:

(Logo: http://photos.prnewswire.com/prnh/20130912/FL78549LOGO )

Blended big data for more accurate insights

  • More complete analysis – Big data becomes more valuable when blended with operational and other data sources to provide a more complete picture of the business. Pentaho 5.0 is the first platform to enable analysts to easily blend all data types and immediately report, visualize and explore for greater insights. Learn more about the value of blending big data – watch the video.
  • Blended at the source for accuracy – Blending big data "at the source" maintains the level of data governance and security necessary for accurate and reliable analysis. In contrast, the more common end-user approach of blending "away from the source" cannot be audited and cannot ensure correct inferences from the data. Pentaho 5.0 enables analysts to create cleansed, architected blends directly from diverse big data sources with the ease of use and real-time access demanded in today's agile analytics environments. For technical details on the advantages of data blending "at the source," read Matt Casters' blog.
  • Just in time blending – Analysts working in distributed, virtualized data environments need accurate, near real-time big data blends for timely and accurate analysis. Typical end-user blending requires the staging of data, resulting in data sets that are often out of date. With the big data integration capabilities in Pentaho 5.0, analysts can confidently blend all data in near real-time and immediately analyze the results. Learn more about big data use cases for blending data to improve customer experience – watch the video.

Simplified analytics and user experience

  • New Pentaho User Console and streamlined user interface – The Pentaho User Console is completely redesigned to significantly improve the user experience, so users can easily browse files, create new content, quickly access recent documents, mark 'favorites,' and more. See it here
  • Redesigned experience for administrators – A new administrator perspective is now integrated into the Pentaho User Console for a single seamless experience. Administrators can easily configure and manage security levels, licensing and servers, resulting in improved efficiency and faster time to implementation. See it here
  • Industry leading operational reporting for MongoDB – Pentaho 5.0 brings business analysts the most in-depth reporting platform for MongoDB, the industry's most popular NoSQL database. Read the press release, Pentaho Announces New Offering for MongoDB-based Business Intelligence.
  • Uniquely inspired custom dashboards – Executives and managers can view top-line metrics through Pentaho's new custom-designed, insight-driven dashboards, delivered directly to desktops or mobile devices. See it here

Enterprise-ready big data integration

  • Broadest and deepest big data integration – Up-to-date integrations and certifications for popular big data stores ensure businesses keep pace with ongoing changes in the big data ecosystem and are prepared for the future. New integrations include Splunk, Amazon Redshift, Cloudera Impala; certifications include MongoDB, Cassandra, DataStax, Cloudera, Intel, Hortonworks and MapR.
  • New features help IT manage huge data volumes efficiently – New capabilities such as job restart, rollback and load balancing are all included in Pentaho 5.0.

Simplified Embedded Analytics

  • New REST services for third-party application developers – Pentaho 5.0 includes new REST services for simplified embedding of analytics and reporting into web-deployed applications delivered 'as-a-service'.
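As an illustration of how a third-party application might consume such a service, the sketch below assembles a GET URL for retrieving a rendered report. This is a hypothetical example: the server location, endpoint path and parameter names are assumptions for illustration, not Pentaho's documented API; consult the Pentaho 5.0 REST documentation for the actual resource paths.

```python
# Hypothetical sketch of embedding Pentaho reporting via REST.
# BASE_URL and the "repos/report" path are assumptions for
# illustration only -- check the Pentaho REST API docs for the
# real resource paths exposed by your server.
from urllib.parse import urlencode, urljoin

BASE_URL = "http://localhost:8080/pentaho/api/"  # assumed server location

def build_report_url(report_path: str, output_format: str = "html") -> str:
    """Construct a GET URL a web app could call to render a report."""
    params = urlencode({"path": report_path, "output": output_format})
    return urljoin(BASE_URL, "repos/report") + "?" + params

# A web application would issue an HTTP GET against this URL and
# embed the returned HTML in its own page.
print(build_report_url("/public/sales/summary.prpt"))
```

The point of a URL-based interface like this is that any HTTP-capable client, regardless of language or framework, can embed the output without linking against a Pentaho SDK.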

Pentaho's CEO Quentin Gallivan commented, "With Pentaho 5.0, businesses striving to increase their competitive advantage with big data analytics can achieve greater accuracy with much simpler, faster processes at every critical stage. Pentaho 5.0 is the result of many years of intensive planning, research, conversations with customers and industry experts and hard work by some of the smartest people in the industry. We are immensely proud to bring this new platform to market to prepare innovative companies of all sizes for big data analytics efforts today and in the future."

According to Tony Cosentino, VP & Research Director at Ventana Research, "We have identified five key analytic personas in the enterprise. Pentaho 5.0 offers benefits to all these roles and has developed functionality to help eliminate many of the common pain points that are holding back big data initiatives in companies of all sizes."

About Pentaho Corporation

Pentaho's integrated data integration and analytics platform prepares innovative companies of all sizes for big data analytics efforts today and in the future. Pentaho's open source heritage drives its continued innovation in a modern, integrated, embeddable platform built for the future of analytics, including diverse and big data requirements. Powerful business analytics are made easy with Pentaho's cost-effective suite for data access, visualization, integration, analysis and predictive analytics. For a free evaluation, download Pentaho Business Analytics at www.pentaho.com/get-started.

SOURCE Pentaho Corporation

Copyright © 2007 PR Newswire. All rights reserved. Republication or redistribution of PRNewswire content is expressly prohibited without the prior written consent of PRNewswire. PRNewswire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
