
Eric Tonissen
We have a Carte service running, with a 'Table output' step writing to a JDBC database. We tried to capture the JDBC driver logging as described at https://jdbc.postgresql.org/documentation/head/logging.html but were not able to get any logging into a file. Can somebody explain to us how to do this?
in Data Integration
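Per the page linked in the question, recent pgjdbc (42.x) drivers log through `java.util.logging` under the `org.postgresql` logger. A minimal sketch, assuming pgjdbc 42.x and an example file name `pgjdbc.log`, of routing that logger to a file programmatically:

```java
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class PgJdbcLogging {
    public static void main(String[] args) throws Exception {
        // pgjdbc 42.x emits its log records via java.util.logging
        // under the "org.postgresql" logger namespace.
        Logger pgLogger = Logger.getLogger("org.postgresql");
        pgLogger.setLevel(Level.FINEST); // most verbose driver logging

        // Route the driver's log records to a file instead of the console.
        FileHandler handler = new FileHandler("pgjdbc.log");
        handler.setLevel(Level.FINEST);
        handler.setFormatter(new SimpleFormatter());
        pgLogger.addHandler(handler);

        // Any JDBC activity after this point is written to pgjdbc.log.
        pgLogger.fine("pgjdbc logging configured"); // demo record
    }
}
```

Since Carte is itself a Java process, the same configuration can alternatively be supplied declaratively via a `logging.properties` file passed with `-Djava.util.logging.config.file=...` in the JVM options used to start Carte.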
Kevin Haas
This post was originally published by Chris Deptula on Tuesday, October 27, 2015. I recently attended the Strata + Hadoop World conference in NYC. I have been attending this conference for the past few years, and each year a theme emerges. A few years ago it was SQL on Hadoop; last year it was all Spark. This year there was a lot of buzz about streaming…
in Pentaho
Kevin Haas
This post was originally published by Kevin Haas on Tuesday, July 14, 2015. When working with our clients, we find a growing number who regard their customer or transaction data as not just an internally leveraged asset, but one that can enrich their relationships with customers and supporting partners. They need to systematically share data and…
in Pentaho
Kevin Haas
This post was originally published by Bryan Senseman on Wednesday, October 15, 2014. I'm a huge user of Mondrian, the high-speed open source OLAP engine behind Pentaho Analyzer, Saiku, and more. While the core Mondrian engine is wonderful, there are times when it doesn't do what I need it to, or exactly what I expect it to. Take this special case…
in Pentaho
Kevin Haas
This post was originally published by Chris Deptula on Wednesday, November 19, 2014. Many of you requested more information on the inner workings of the Sqoop component. Perhaps the best way to explain is via "lessons learned". Here goes… Use the split-by option: Sqoop is primarily used to extract and import data from databases into HDFS and…
in Pentaho
Kevin Haas
This post was originally published by Chris Deptula on Tuesday, February 24, 2015. This is the third in a three-part blog on working with small files in Hadoop. In my previous blogs, we defined what constitutes a small file and why Hadoop prefers fewer, larger files. We then elaborated on the specific issues that small files cause, specifically…
in Pentaho
Kevin Haas
This post was originally published by Chris Deptula on Wednesday, February 18, 2015. This is the second in a three-part blog on working with small files in Hadoop. In my first blog, I discussed what constitutes a small file and why Hadoop has problems with small files. I defined a small file as any file smaller than 75% of the Hadoop block size, and…
in Pentaho
Kevin Haas
This post was published by Chris Deptula on Wednesday, February 11, 2015. This is the first in a three-part blog on working with small files in Hadoop. Hadoop does not work well with lots of small files; it wants fewer, larger files instead. This is probably a statement you have heard before. But why does Hadoop have a problem with large numbers of small…
in Pentaho
Nikola Garafolic
Using any version of PDI between 7.1.0.3 and 8.0.0.0, I am suddenly unable to use it anymore because of this error:   INFO: Setting the server's publish address to be /browser 2017/12/01 12:36:55 - org.pentaho.di.ui.util.EnvironmentUtils@54f4a7f0 - ERROR (version 8.0.0.0-1, build 8.0.0.0-1 from 2017-10-26 08.39.26 by nikola) : Could not open a browser…
in Data Integration
Pedro Goncalves
Dashboards: CDE - Dashboard Editor, CDF - Dashboard Framework, CCC - Chart Components, CGG - Graphics Generator. Admin/Development: CDC - Distributed Cache, CDV - Data Validation, CFR - File Repository, CST - Startup Tabs, CBF2 - Build Framework, CTE - Text Editor, App Builder. Data: CDA - Data Access, CDB - Data Browser, CDG - Data Generator…
in Pentaho
B241Q6DO
Hi all,   I want to do something very simple: I get a date from my database as an Integer in the format yyyyMMdd, e.g. 20171213 for today's date. Now I want to convert that to a real date with the Select Values step and the Meta-data tab. But for some reason the result is always 19700101 (see attached transformation).   I must be missing…
in Data Integration
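A 19700101 result usually means the integer was converted to a date directly, so the number is read as milliseconds since the Unix epoch rather than parsed against the yyyyMMdd mask. A minimal Java sketch of both behaviors, outside PDI (the UTC time zone is fixed here only to make the output deterministic):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class YyyymmddDemo {
    public static void main(String[] args) throws Exception {
        int raw = 20171213; // integer date from the database

        SimpleDateFormat fmt = new SimpleDateFormat("yyyyMMdd");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));

        // Direct Integer -> Date conversion: the number is treated as
        // milliseconds since 1970-01-01, i.e. about 5.6 hours into day one.
        Date wrong = new Date(raw);
        System.out.println(fmt.format(wrong)); // prints "19700101"

        // Converting Integer -> String -> Date with format mask "yyyyMMdd"
        // parses the digits as a calendar date instead.
        Date right = fmt.parse(String.valueOf(raw));
        System.out.println(fmt.format(right)); // prints "20171213"
    }
}
```

This suggests inserting an intermediate conversion to String (or setting a yyyyMMdd format mask on a String value) before converting to Date, rather than converting the Integer to Date in one step.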
Rico de Feijter
UPDATE: I have now changed my workflow to the following. However, only one row is inserted, while the query in the Table input step returns 51 rows. Could someone help me out? /UPDATE…
in Pentaho
Kevin Haas
This post was written by Dave Reinke and originally published on Wednesday, June 22, 2016. In a previous blog, we discussed the importance of tuning data lookups within Pentaho Data Integration (PDI) transformations. We suggested that there were identifiable design patterns which can be exploited to improve the performance and stability of our…
in Pentaho