Hello, I wonder what could be the reason that I get "access denied" on the Search UI (local security realm) as the normal admin user? I can successfully access the management UI with the same credentials. I have already closed and reopened the browser, and used a different browser too, but I always hit the same issue. Can somebody please explain why…
Hi All, I am trying to grab an auth token via the REST API and constantly get a 401 Unauthorized error, regardless of whether I am on the local machine or another client. Any ideas why this might be occurring? Request: curl -ik -X POST https://IPADDRESS:8000/auth/oauth/ \ -d grant_type=password \ -d username=admin \ -d password=APASSWORD \ -d…
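A 401 on the password grant often just means the request is missing the OAuth client fields. Below is a minimal sketch of the full form body, assuming the common HCI defaults of "hci-client" for both client_id and client_secret and a "*" scope (verify these against your own system; the host and credentials are the placeholders from the post):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder host/credentials from the post. The scope, client_id and
# client_secret values are assumptions -- a password grant without them
# is a common cause of a 401.
form = {
    "grant_type": "password",
    "username": "admin",
    "password": "APASSWORD",
    "scope": "*",
    "client_id": "hci-client",
    "client_secret": "hci-client",
}
body = urlencode(form).encode()
req = Request("https://IPADDRESS:8000/auth/oauth/", data=body, method="POST")
print(req.get_method(), req.full_url)
print(body.decode())
```

The Request object is only built here, not sent; passing it to urllib.request.urlopen would perform the actual POST.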
How do I modify a target step's row meta structure from the code of my UDJC step (which is the source step)?
Hi there, in my UDJC code I have retrieved the target step's meta like this: StepMeta myTargetStepMeta = getTransMeta().findStep(getParameter("MY_TARGET_STEPNAME")); There is a hop between my UDJC step (the source step) and the target step (myTargetStepMeta). At the beginning of the processRow() method in my UDJC step, I am…
Hi, I'm supposing that you didn't add the new fields to the Fields tab, as explained here: User Defined Java Class - Pentaho Data Integration - Pentaho Wiki? Or do you need to add the new fields dynamically in the code? One other thing, not directly related to your issue: if you have a hop from your UDJC to the "target step", you should…
Hi, I have a clustered HCPAW VM with version 220.127.116.11 HF0010. I am trying to upgrade to version 18.104.22.168, and during the upgrade the precheck failed. It showed the following: event ID: 10044 Cannot perform upgrade. Failed to execute /home/install/anywhere-genesis/package/opt/anywhere/sil/bin/mapicli I need help overcoming this issue.
Currently looking to retrieve information on the last date on which each file has been accessed (as opposed to simply modified). Any recommendations on current schemas, alone or in combination, would be welcome.
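Not an HCI schema recommendation, but the distinction the question is after already exists at the filesystem level: POSIX tracks a last-accessed timestamp (st_atime) separately from last-modified (st_mtime). A small self-contained sketch of the difference:

```python
import os
import tempfile
import time

# Filesystem-level illustration (not HCI-specific): POSIX keeps the last
# access time (st_atime) separate from the last modification time (st_mtime).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"demo")
    path = f.name

st = os.stat(path)
print("modified:", time.ctime(st.st_mtime))

# Reading the file counts as an access. Caveat: many mounts use the
# "relatime" option, so on-disk atime updates can be deferred or skipped.
with open(path) as f:
    f.read()
print("accessed:", time.ctime(os.stat(path).st_atime))

os.unlink(path)
```

Whether a given connector or schema surfaces the access time (as opposed to the modification time) still depends on the data source exposing it.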
We are looking to get the aggregate volume of data under an index using the API Search function offered by HCI. This involves leveraging the stats component offered by the Solr index via the query function (The Stats Component | Apache Solr Reference Guide 7.0). The script below returns the filtered items (i.e. there is no error) but does not…
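For reference, the raw Solr form of a stats request can be sketched as below. The key details are that stats=true and stats.field must both be set, and rows=0 suppresses the matched items so only the aggregates come back — getting items but no stats usually means one of these parameters was dropped. The field name "size" is a hypothetical numeric field holding object size, and how these parameters map onto HCI's API Search request body may differ:

```python
from urllib.parse import urlencode

# Solr stats-component parameters per the Solr 7.0 reference guide.
# "size" is a hypothetical numeric field; substitute the field in your
# index that holds each object's size in bytes.
params = {
    "q": "*:*",            # match everything under the index
    "rows": 0,             # aggregates only; skip returning the items
    "stats": "true",       # enable the stats component
    "stats.field": "size", # field to compute sum/min/max/mean over
}
query = "/solr/index/select?" + urlencode(params)
print(query)
```

In the Solr response, the totals then appear under the "stats" section (e.g. the "sum" value for the field) rather than in the document list.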
When I run Spoon on Fedora 27 I get the following error:
WARNING: no libwebkitgtk-1.0 detected, some features will be unavailable
Consider installing the package with apt-get or yum. e.g. 'sudo apt-get install libwebkitgtk-1.0-0'
#######################################################################
OpenJDK 64-Bit Server VM warning:…
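Spoon's embedded browser widget needs WebKitGTK+ 1.x, and the apt-get hint in the warning is Debian/Ubuntu-oriented, so it does not apply directly on Fedora. A quick way to check whether the library is visible to the dynamic loader, assuming the soname "webkitgtk-1.0" from the warning text:

```python
from ctypes.util import find_library

# Returns the library filename if the loader can see it, else None.
# The soname "webkitgtk-1.0" is an assumption based on the warning text.
lib = find_library("webkitgtk-1.0")
print("found:", lib)
```

If this prints None, installing a WebKitGTK+ 1.x compatibility package for your Fedora release (or living without the browser-based features) is the usual workaround.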
I am trying to evaluate PDI with a hybrid cloud. I would like a minimal install on an on-prem server (gateway) and to run PDI on a cloud machine (Azure). I want to update data sets and transformations there, along with scheduling, rather than having to update the on-prem server. I cannot seem to find documentation on this.
I have a customer (IHAC) that is processing e-mails. These e-mails can have large attachments (more than 1 MB). The desire is to index both the e-mail and the attachments. I do understand that there is a stage that allows breaking e-mails apart into multiple documents for processing; however, the customer needs one e-mail (attachments included) to be equal to 1 Solr…
We have more than 1 million objects which failed out of 250,000,000 indexed objects, partially because I previously rejected all objects larger than 20 MB. I have now removed that constraint. I recently ran the re-try failure task (for the first time) and it halted with the following message: java.lang.InterruptedException: sleep interrupted at…
As I get deeper into performance issues, I have started to wonder whether HCI takes special consideration with respect to distributing node connections on HCP. With a load balancer in front of HCP, HCI connection requests to HCP would be properly load balanced. However, if a load balancer is not used, does HCI have the same issues with HCP…
Wondering if the new version of HCI could help with this scenario. We have ~40 PB of Hadoop data that needs to be copied and moved to a second data center (not using the WAN) and copied into a new Hadoop cluster. I was thinking: use the new release of HCI to copy data from the source HDFS to a local HCP; super-scale individual HCI processes for crawling,…