Pentaho training from Hachion helps you learn the Pentaho BI suite, covering Pentaho Data Integration, Pentaho Report Designer, Pentaho Mondrian Cubes, and Dashboards. This Pentaho online course will help you prepare for the Pentaho Data Integration exam, and you will work on real-life projects.

This Hachion Pentaho Business Intelligence training course equips you with knowledge of Business Intelligence and data warehousing concepts, along with in-depth coverage of Pentaho Data Integration (also known as Kettle), Pentaho Reporting, Dashboards, and Mondrian Cubes. Pentaho is a comprehensive open-source BI suite that integrates with Hadoop distributions for handling large datasets and reporting on top of them. This training will also equip you with the skills to integrate the Pentaho BI suite with Hadoop.

Big Data is rapidly entering the mainstream, and there is an urgent need for flexible tools that can address changing requirements. Pentaho is a versatile tool that is simple yet effective in the Business Intelligence space, so it is expected to grow at a fast pace. The Hachion Pentaho training certification program provides great opportunities for professionals in this domain.
Pentaho user console, overview of Pentaho Business Intelligence and Analytics tools, database dimensional modelling, using a Star Schema for querying large data sets, understanding fact tables and dimension tables, Snowflake Schema, principles of Slowly Changing Dimensions, how high availability is supported for the DI server and BA server, managing Pentaho artifacts, knowledge of big data solution architectures
Hands-on Exercise – Schedule a report using the user console, create a model using database dimensional modeling techniques, create a Star Schema for querying large data sets, use fact tables and dimension tables, manage Pentaho artifacts
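To make the dimensional-modeling ideas concrete, here is a minimal star schema sketch in Java using plain JDBC. The JDBC URL, credentials, and table and column names are illustrative assumptions, and a matching JDBC driver (PostgreSQL here) is assumed to be on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Minimal star schema: one fact table referencing two dimension tables.
// The JDBC URL, credentials, and table names are illustrative.
public class StarSchemaDemo {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/dw", "dw_user", "secret");
             Statement stmt = conn.createStatement()) {

            // Dimension tables hold descriptive attributes.
            stmt.execute("CREATE TABLE dim_date (date_key INT PRIMARY KEY, "
                    + "full_date DATE, year INT, quarter INT)");
            stmt.execute("CREATE TABLE dim_product (product_key INT PRIMARY KEY, "
                    + "product_name VARCHAR(100), category VARCHAR(50))");

            // The fact table holds measures plus foreign keys to each dimension.
            stmt.execute("CREATE TABLE fact_sales ("
                    + "date_key INT REFERENCES dim_date(date_key), "
                    + "product_key INT REFERENCES dim_product(product_key), "
                    + "quantity INT, revenue NUMERIC(12,2))");
        }
    }
}
```

Queries then join the central fact table to whichever dimensions a report needs, which is exactly the access pattern a star schema optimizes for.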
Designing data models for reporting, Pentaho's support for predictive analytics, designing a Streamlined Data Refinery (SDR) solution for a client
Hands-on Exercise – Design data models for reporting, Perform predictive analytics on a data set, design a Streamlined Data Refinery (SDR) solution for a dummy client
Understanding the basics of clustering in Pentaho Data Integration, creating a database connection, moving a CSV file input to table output and Microsoft Excel output, moving from Excel to data grid and log.
Hands-on Exercise – Create a database connection, move a CSV file input to table output and Microsoft Excel output, move data from Excel to data grid and log
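A transformation built in Spoon (for example, CSV input to table output) can also be run programmatically with the PDI/Kettle Java API. This is a minimal sketch; the .ktr file name is an assumption:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

// Runs a transformation designed in Spoon (e.g. CSV input -> table output).
// The .ktr path is illustrative.
public class RunTransformation {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                        // boot the PDI engine
        TransMeta meta = new TransMeta("csv_to_table.ktr");
        Trans trans = new Trans(meta);
        trans.execute(null);                             // no extra arguments
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}
```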
The Pentaho Data Integration transformation steps, adding sequence, understanding calculator, Pentaho number range, string replace, selecting field value, sorting and splitting rows, string operation, unique row and value mapper, usage of metadata injection
Hands-on Exercise – Practice various steps to perform data integration transformation, add sequence, use calculator, work on number range, select field values, sort and split rows, perform string operations, use unique row and value mapper, use metadata injection
Working with secure socket command, Pentaho null value and error handling, Pentaho mail, row filter and prioritize streams.
Hands-on Exercise – Work with secure socket command, Handle null values in the data, perform error handling, send email, get row filtered data, set stream priorities
Understanding Slowly Changing Dimensions, making ETL dynamic, dynamic transformation, creating folders, scripting, bulk loading, file management, working with Pentaho file transfer, Repository, XML, Utility and File encryption.
Hands-on Exercise – Make ETL dynamic transformation, create folders, write scripts, load bulk data, perform file management ops, work with Pentaho file transfer, XML utility and File encryption
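The Slowly Changing Dimension pattern covered here is worth seeing in miniature. PDI's "Dimension lookup/update" step automates Type 2 history tracking; the sketch below shows the same close-then-insert idea in plain JDBC, with a hypothetical dim_customer table:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

// Conceptual SCD Type 2 update: expire the current row, insert a new version.
// PDI's "Dimension lookup/update" step automates this pattern.
public class Scd2Sketch {
    static void updateCustomer(Connection conn, int customerId,
                               String newCity) throws Exception {
        // 1. Close out the currently active row for this business key.
        try (PreparedStatement close = conn.prepareStatement(
                "UPDATE dim_customer SET valid_to = CURRENT_DATE, is_current = FALSE "
                + "WHERE customer_id = ? AND is_current = TRUE")) {
            close.setInt(1, customerId);
            close.executeUpdate();
        }
        // 2. Insert the new version with an open-ended validity range.
        try (PreparedStatement insert = conn.prepareStatement(
                "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
                + "VALUES (?, ?, CURRENT_DATE, NULL, TRUE)")) {
            insert.setInt(1, customerId);
            insert.setString(2, newCity);
            insert.executeUpdate();
        }
    }
}
```

Because old rows are kept rather than overwritten, reports can always reconstruct what the dimension looked like at any point in time.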
Creating dynamic ETL, passing variable and value from job to transformation, deploying parameter with transformation, importance of Repository in Pentaho, database connection, environmental variable and repository import.
Hands-on Exercise – Create dynamic ETL, pass variable and value from job to transformation, deploy parameter with transformation, connect to a database, set Pentaho environment variables, import a repository into the Pentaho workspace
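Passing values into a transformation at run time looks like this with the Kettle Java API. The parameter name and file paths are illustrative, and the parameter must already be declared in the transformation's settings:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

// Passes a named parameter into a transformation at run time, so the same
// .ktr can load different source files on each run.
public class ParameterizedRun {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();
        TransMeta meta = new TransMeta("load_file.ktr");
        Trans trans = new Trans(meta);
        // INPUT_FILE must be declared as a named parameter in the .ktr.
        trans.setParameterValue("INPUT_FILE", "/data/incoming/orders.csv");
        trans.execute(null);
        trans.waitUntilFinished();
    }
}
```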
Working with Pentaho Dashboards and Reports, the effect of row banding, designing a report, working with the Pentaho Server, creating line, bar and pie charts in Pentaho, how to achieve localization in reports
Hands-on Exercise – Create a Pentaho dashboard and report, check the effect of row banding, design a report, work with the Pentaho Server, create line, bar and pie charts in Pentaho, implement localization in a report
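Reports designed in Pentaho Report Designer (.prpt files) can also be rendered headlessly with the Pentaho Reporting engine. A minimal sketch, assuming the reporting-engine jars are on the classpath; the file names are placeholders:

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.net.URL;

import org.pentaho.reporting.engine.classic.core.ClassicEngineBoot;
import org.pentaho.reporting.engine.classic.core.MasterReport;
import org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil;
import org.pentaho.reporting.libraries.resourceloader.ResourceManager;

// Renders a report designed in Pentaho Report Designer (.prpt) to PDF.
public class RenderReport {
    public static void main(String[] args) throws Exception {
        ClassicEngineBoot.getInstance().start();    // boot the reporting engine
        ResourceManager manager = new ResourceManager();
        URL reportUrl = new URL("file:sales_report.prpt");
        MasterReport report = (MasterReport) manager
                .createDirectly(reportUrl, MasterReport.class).getResource();
        try (OutputStream out = new BufferedOutputStream(
                new FileOutputStream("sales_report.pdf"))) {
            PdfReportUtil.createPDF(report, out);
        }
    }
}
```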
Working with Pentaho Dashboards, passing parameters in Reports and Dashboards, drill-down of Reports, deploying Cubes for report creation, working with Excel sheets, Pentaho data integration for report creation.
Hands-on Exercise – Pass parameters in Report and Dashboard, deploy Cubes for report creation, drill down in a report to understand the entries, import data from an Excel sheet, perform data integration for report creation
What is a Cube? Creating a Cube and its benefits, working with Cubes, Report and Dashboard creation with Cubes.
Hands-on Exercise – Create a Cube, create a report and dashboard with the Cube
Understanding the basics of Multidimensional Expressions (MDX): tuples and their implicit dimensions, MDX sets, levels, members, dimension referencing, hierarchical navigation, and metadata.
Hands-on Exercise – Work with MDX, use MDX sets, levels, members, dimension referencing, hierarchical navigation, and metadata
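To see sets, levels, and members in action, here is a small MDX query run against a Mondrian cube through the olap4j API. The connection string, catalog file, and cube, member, and measure names are all illustrative:

```java
import java.sql.DriverManager;

import org.olap4j.CellSet;
import org.olap4j.OlapConnection;
import org.olap4j.OlapStatement;

// Runs an MDX query against a Mondrian cube via olap4j: a measure on
// columns and the members of a Product level on rows.
public class MdxQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("mondrian.olap4j.MondrianOlap4jDriver");
        OlapConnection conn = DriverManager
                .getConnection("jdbc:mondrian:Jdbc=jdbc:postgresql://localhost/dw;"
                        + "Catalog=file:SalesSchema.xml")
                .unwrap(OlapConnection.class);
        OlapStatement stmt = conn.createStatement();
        CellSet result = stmt.executeOlapQuery(
                "SELECT {[Measures].[Revenue]} ON COLUMNS, "
                + "{[Product].[Category].Members} ON ROWS "
                + "FROM [Sales]");
        System.out.println(result.getAxes().size() + " axes returned");
        conn.close();
    }
}
```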
Pentaho analytics for discovering and blending various data types and sizes, including advanced analytics for visualizing data across multiple dimensions, extending Analyzer functionality, embedding BA server reports, Pentaho REST APIs
Hands-on Exercise – Blend various data types and sizes, perform advanced analytics for visualizing data across multiple dimensions, embed a BA server report
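The BA server exposes its functionality over REST with HTTP Basic authentication. A minimal sketch using Java's built-in HTTP client; the host, credentials, and endpoint path are assumptions to adapt to your server's API documentation:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

// Calls a BA server REST endpoint with HTTP Basic auth. The host,
// credentials, and endpoint path are illustrative; consult your server's
// REST API documentation for the exact resources it exposes.
public class BaServerRestCall {
    public static void main(String[] args) throws Exception {
        String auth = Base64.getEncoder()
                .encodeToString("admin:password".getBytes());
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/pentaho/api/session/userName"))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```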
Knowledge of the PDI steps used to create an ETL job, describing the PDI/Kettle steps to create an ETL transformation, describing the use of property files
Hands-on Exercise – Create an ETL transformation using PDI / Kettle steps, Use property files
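Property files keep environment-specific settings (hosts, credentials, paths) out of the ETL logic itself. PDI reads ~/.kettle/kettle.properties this way and exposes each entry as a ${VARIABLE} inside jobs and transformations; the sketch below shows the same idea with a hypothetical etl.properties file:

```java
import java.io.FileInputStream;
import java.util.Properties;

// Loads connection settings from a property file so the ETL job stays
// environment-independent. Keys, file name, and defaults are illustrative.
public class LoadEtlProperties {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("etl.properties")) {
            props.load(in);
        }
        String dbHost = props.getProperty("DB_HOST", "localhost");
        String dbName = props.getProperty("DB_NAME", "staging");
        System.out.println("Connecting to " + dbHost + "/" + dbName);
    }
}
```

Promoting a job from development to production then means swapping the property file, not editing the transformation.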
Deploying ETL capabilities for working on the Hadoop ecosystem, integrating with HDFS and moving data from a local file to the distributed file system, deploying Apache Hive, designing MapReduce jobs, complete Hadoop integration with the ETL tool.
Hands-on Exercise – Deploy ETL capabilities for working on the Hadoop ecosystem, Integrate with HDFS and move data from local file to distributed file system, deploy Apache Hive, design MapReduce jobs
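Moving a local file into HDFS, the first step of most Hadoop integrations, can be done directly with the Hadoop FileSystem API. The namenode URI and paths are illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Copies a local file into HDFS, which is essentially what PDI's Hadoop
// file output steps do on your behalf.
public class CopyToHdfs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");
        try (FileSystem fs = FileSystem.get(conf)) {
            fs.copyFromLocalFile(new Path("/tmp/orders.csv"),
                                 new Path("/data/raw/orders.csv"));
        }
    }
}
```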
Creating interactive dashboards that give a highly graphical representation of data for improving key business performance.
Hands-on Exercise – Create interactive dashboards for visualizing graphical representation of data
Managing BA server logging, tuning Pentaho reports, monitoring the performance of a job or a transformation, Auditing in Pentaho
Hands-on Exercise – Manage logging in BA server, Fine tune Pentaho report, Monitor the performance of an ETL job
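Monitoring a job or transformation from code comes down to raising the log level and inspecting the Result object after the run. A minimal sketch; the .ktr file name is a placeholder:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.core.logging.LogLevel;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

// Runs a transformation at a verbose log level and inspects the Result
// object for row counts and errors.
public class MonitorRun {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();
        Trans trans = new Trans(new TransMeta("nightly_load.ktr"));
        trans.setLogLevel(LogLevel.DETAILED);   // more detail than BASIC
        trans.execute(null);
        trans.waitUntilFinished();
        Result result = trans.getResult();
        System.out.println("Rows read:    " + result.getNrLinesRead());
        System.out.println("Rows written: " + result.getNrLinesWritten());
        System.out.println("Errors:       " + result.getNrErrors());
    }
}
```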
Integrating user security with other enterprise systems, Extending BA server content security, Securing data, Pentaho’s support for multi-tenancy, Using Kerberos with Pentaho
Hands-on Exercise – Configure security settings to implement high-level security