How to Clear Hadoop/HDFS Cache for Infoworks Jars
Issue
Clearing Hadoop/HDFS cache for jars packaged from Infoworks.
Solution
- Log in to the Infoworks edge node (the machine where Infoworks is installed).
- Run the following commands:
cd $IW_HOME/bin
source env.sh
./stop.sh hangman
cd $IW_HOME/conf
grep hdfs_temp conf.properties
NOTE: Before stopping the services, ensure that no Infoworks jobs are running. You can verify the job queue on the Infoworks ADE Admin page.
- Copy the iw_hdfs_temp_home path displayed in the output. For example, iw_hdfs_temp_home=/user/ec2-user/temp.
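This lookup can also be scripted. The sketch below extracts the iw_hdfs_temp_home value into a shell variable; a sample conf.properties with the example value is created here purely for illustration, and on a real edge node you would point the grep at $IW_HOME/conf/conf.properties instead.

```shell
# Illustration only: create a sample conf.properties with the example value.
# On the edge node, use "$IW_HOME/conf/conf.properties" instead.
conf=$(mktemp)
echo 'iw_hdfs_temp_home=/user/ec2-user/temp' > "$conf"

# Take everything after the first '=' on the iw_hdfs_temp_home line.
IW_HDFS_TEMP=$(grep '^iw_hdfs_temp_home' "$conf" | cut -d'=' -f2-)
echo "$IW_HDFS_TEMP"   # /user/ec2-user/temp

rm -f "$conf"
```

Capturing the path in a variable avoids copy-paste mistakes when it is reused in the removal commands of the next step.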
- Remove the jars from the directories below, using the iw_hdfs_temp_home path from the previous step as the base path.
For Ingestion and related jars:
hadoop fs -rm -R <iw_hdfs_temp_home>/iwjobs/libjars/opt/infoworks/
For UDF and related jars:
hadoop fs -rm -R <iw_hdfs_temp_home>/udfs/opt/infoworks/
For example, with iw_hdfs_temp_home=/user/ec2-user/temp:
hadoop fs -rm -R /user/ec2-user/temp/iwjobs/libjars/opt/infoworks/
hadoop fs -rm -R /user/ec2-user/temp/udfs/opt/infoworks/
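To reduce the chance of a typo sending a recursive delete to the wrong HDFS directory, the two target paths can be built from the base path in variables first. This is a sketch: BASE is the example iw_hdfs_temp_home value from above (substitute your own), and the actual hadoop commands are shown as comments since they only make sense on the edge node.

```shell
# BASE is the iw_hdfs_temp_home value from conf.properties (example shown).
BASE=/user/ec2-user/temp
LIBJARS="$BASE/iwjobs/libjars/opt/infoworks/"
UDFS="$BASE/udfs/opt/infoworks/"

# On the edge node, the removals would then be:
#   hadoop fs -rm -R "$LIBJARS"
#   hadoop fs -rm -R "$UDFS"

# Print the paths so they can be double-checked before deleting anything.
echo "$LIBJARS"
echo "$UDFS"
```

Reviewing the printed paths before running the removals is a cheap safeguard, since hadoop fs -rm -R deletes recursively.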
- Run the following commands:
cd $IW_HOME/bin
source env.sh
./start.sh hangman
- Log in to the Infoworks ADE and rerun the job.
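The stop, clean, and restart sequence above can be reviewed end to end as a dry-run script. The sketch below only prints each command instead of executing it, so it is safe to inspect anywhere; on the edge node you would change run() to execute its arguments. IW_HOME, env.sh, stop.sh/start.sh, and the hadoop CLI all come from the Infoworks installation and are not expanded here.

```shell
#!/bin/sh
# Dry-run sketch of the full procedure: each step is printed, not executed.
run() { printf '+ %s\n' "$*"; }

# Value copied from conf.properties (example value shown above).
IW_HDFS_TEMP=/user/ec2-user/temp

run ". \$IW_HOME/bin/env.sh"
run "\$IW_HOME/bin/stop.sh hangman"
run "hadoop fs -rm -R $IW_HDFS_TEMP/iwjobs/libjars/opt/infoworks/"
run "hadoop fs -rm -R $IW_HDFS_TEMP/udfs/opt/infoworks/"
run "\$IW_HOME/bin/start.sh hangman"
```

Printing the resolved commands first makes it easy to confirm the HDFS paths are correct before any recursive delete is actually issued.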
For more details, refer to our Knowledge Base and Best Practices!
For help, contact our support team!
(C) 2015-2022 Infoworks.io, Inc. Confidential.