
Monday, 10 December 2012


OBIEE Upload a CSV File and Process into a Database with Linux CRON Job


By Krishna Mohan & Shahed Munir



This was a basic requirement from one of our clients: upload template CSV files with fixed layouts but different saved names through OBIEE, and insert their contents into a database table.

There may be better ways; this is the one we were comfortable with on a Linux x64 platform running OBIEE 11.1.1.6.5. This is just an overview of how it can be achieved. The solution below works, but it will need customization, and some code changes, to fit your immediate needs.

No Java Needed.

Uploading a File

Step 1: Go to Catalog on the toolbar.

Step 2: Create a folder to receive the uploads. We created a folder called upload; call it what you want.


Step 3: As seen in the image, there is an Upload button available for uploading files.


Step 4: Click the Upload button and upload a file.

The file is uploaded into the catalog directory under the folder you created. The only difference in the filename is that the . is replaced by %2e.
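If you later pull the uploaded files out of the catalog directory with a script, you may want to turn the %2e escape back into a literal dot. A minimal sketch, assuming a hypothetical staging directory (the path and file name below are illustrative, not from our setup):

```shell
#!/bin/sh
# Hypothetical staging copy of the catalog upload folder.
STAGE_DIR="/tmp/upload_stage"
mkdir -p "$STAGE_DIR"
touch "$STAGE_DIR/products%2ecsv"    # simulate a file saved by the catalog

# Rename each file, replacing the %2e escape with a literal dot.
for f in "$STAGE_DIR"/*%2e*; do
  [ -e "$f" ] || continue
  mv "$f" "$(printf '%s' "$f" | sed 's/%2e/./g')"
done

ls "$STAGE_DIR"
```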

The contents of our file are:

Date,Pet
29-Feb-12,Dog
01-Mar-12,Cat

With this solution, users can upload multiple files with the same layout within OBIEE, and they will all be processed into the same table.



The upload button can be embedded into a dashboard: insert a Link or Image object with the caption "Upload a File" and the following URL.

URL

saw.dll?catalog#%7B%22location%22%3A%22%2Fshared%2Fupload%22%7D

The above URL will go directly to the upload folder.
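For reference, the hash fragment in that URL is just URL-encoded JSON naming the catalog location. A quick sketch that decodes the handful of percent-escapes involved:

```shell
#!/bin/sh
fragment='%7B%22location%22%3A%22%2Fshared%2Fupload%22%7D'

# Decode the escapes used in the fragment: { } " : /
decoded=$(printf '%s' "$fragment" | sed \
  -e 's/%7B/{/g' -e 's/%7D/}/g' \
  -e 's/%22/"/g' -e 's/%3A/:/g' -e 's|%2F|/|g')

echo "$decoded"    # {"location":"/shared/upload"}
```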

Viewing the File

You can either view the file in the catalog by double-clicking it, or call the file for download from a dashboard.

Just insert a Link or Image object onto the dashboard, as seen in the screenshot, and use the following HTML link. We uploaded a file called products.csv:



saw.dll?downloadFile&path=/shared/upload/products.csv

Amend the folder name and file name to match the folder and file you uploaded.


Processing the CSV File and Inserting into a Database

1. Get a SQL*Plus connection on the Linux box

If you have an Oracle Home installed on your Linux box, you will already have access to the sqlplus command. If not, you can install a utility RPM called oratoolkit:

oratoolkit-1.0.2.1.5-1.noarch.rpm
 
This one is for the Linux x64 instance we have. Find the relevant toolkit for your platform so that the sqlplus command will run on your Linux instance.

2. Create a Linux script to process the CSV file



# We have not yet included the mv command to move a processed file from ftp to archive. In your case it will be from the catalog folder to an archive directory.

Create a shell script (.sh) file, paste in the script below, change the parameters and locations, and run it. We called our file schedule1.sh. Remember to create the target table in the database as well, or alter the script to read the CSV and create the table.
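The script assumes the target table already exists. A minimal sketch of the DDL, written out to a .sql file you could run once through sqlplus; the column types are assumptions inferred from the insert statements the script generates:

```shell
#!/bin/sh
# Write the DDL to a file; run it once with:
#   sqlplus -s myuser/mypassword@mydatabase @/tmp/create_pets.sql
cat > /tmp/create_pets.sql <<'EOF'
create table pets (
  unique_id varchar2(14),   -- load timestamp (mmddyyHHMMSS)
  date_id   varchar2(20),   -- first CSV column (Date)
  pet_type  varchar2(50)    -- second CSV column (Pet)
);
EOF
cat /tmp/create_pets.sql
```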

---------------------------------------------------------------------Start of File

#!/bin/ksh
export LD_LIBRARY_PATH=/usr/lib/oracle/11.2/client64/lib
export TNS_ADMIN=/MiddlewareHome/Oracle_BI1/network/admin
export ORACLE_HOME=/usr/lib/oracle/11.2/client64
PATH=$PATH:/usr/lib/oracle/11.2/client64/bin
export PATH

# Search folder will be your OBIEE catalog folder where you upload files
SEARCH_FOLDER="/oracle/data/load/ftp/*"
SQL_FOLDER="/oracle/data/load/sql/"
SQL_ARCHIVE="/oracle/data/load/archive/."
MASTER_LOG="/oracle/data/load/masterlog.log"
# The master log can be shown in OBIEE - just change the location to a catalog location.

conn_string="myuser/mypassword@mydatabase"
insert_cnt=0
time_stamp=`date +"%m%d%y%H%M%S"`

# Loop over the files in the search folder; the file names can differ.
for f in $SEARCH_FOLDER
do
  if [ -d "$f" ]
  then
    for ff in $f/*
    do
      echo " KM Processing $ff"
    done
  else
    echo "Processing file $f"

    output_file=$SQL_FOLDER`date +"%m%d%y%H%M%S"`_${f##/*/}.sql
    echo "------------------------------ Start ------------------------------" >> ${MASTER_LOG}
    echo ${time_stamp} >> ${MASTER_LOG}
    echo ${output_file} >> ${MASTER_LOG}

    # Turn every CSV row after the header into an insert statement.
    awk -F',' 'NR>1 { print "insert into pets (unique_id,date_id,pet_type) values('"$time_stamp"',\x27"$1"\x27,\x27"$2"\x27);" }' $f > ${output_file}
    echo "commit;" >> ${output_file}
    echo "exit" >> ${output_file}

    sqlplus -s ${conn_string} @${output_file} >> ${MASTER_LOG}
  fi

  echo "------------------------------ END ------------------------------" >> ${MASTER_LOG}
done


---------------------------------------------------------------------End of File

Execute the script to insert records from the CSV into a table. Once it runs, insert scripts are created and the sqlplus command executes them to insert the rows into a database table of your choice.
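To see what the script generates, the awk one-liner can be run by hand against the sample file from earlier. A standalone sketch with a fixed timestamp (normally taken from `date`); the /tmp paths are illustrative:

```shell
#!/bin/sh
# Sample file matching the upload layout.
cat > /tmp/pets.csv <<'EOF'
Date,Pet
29-Feb-12,Dog
01-Mar-12,Cat
EOF

time_stamp=121012173000    # stand-in for `date +"%m%d%y%H%M%S"`

# NR>1 skips the header row; \x27 emits the single quotes around each value.
awk -F',' 'NR>1 { print "insert into pets (unique_id,date_id,pet_type) values('"$time_stamp"',\x27"$1"\x27,\x27"$2"\x27);" }' /tmp/pets.csv
```

This prints one insert statement per data row, e.g. `insert into pets (unique_id,date_id,pet_type) values(121012173000,'29-Feb-12','Dog');`.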

Remember the script can be customized for various scenarios and additional checks / validations.

Next, schedule the shell script to run at set intervals and pick up any files that have been uploaded by users. We called our .sh script schedule1.sh.


Schedule schedule1.sh to run every day at 5.30pm using cron

cron is a Linux system process that executes a program at a preset time.

To create a new schedule, use the following command:

crontab -e

When the editor window opens, add a line like the one below to run the shell script schedule1.sh, located in the directory /oracle/data, at 5:30 PM every day:

30 17 * * * /oracle/data/schedule1.sh
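Cron discards a job's output by default, so it is worth capturing it for troubleshooting. A hedged variant of the same entry (the log path is an assumption, not from our setup):

```shell
# Same schedule, with stdout and stderr appended to a log file.
30 17 * * * /oracle/data/schedule1.sh >> /oracle/data/load/cron.log 2>&1
```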

Once saved, cron displays a message on the screen confirming the new crontab has been installed.
The syntax for a crontab entry is:

[min] [hour] [day of month] [month] [day of week] [program to be run]

10 14 * * 1 /oracle/data/schedule1.sh

will run /oracle/data/schedule1.sh at 2:10 PM every Monday.

To see at any point which jobs are scheduled to run, use the following command:

crontab -l

This command lists the scheduled jobs; in our case it shows the single job above.
To stop a scheduled run, edit the crontab with the crontab -e command, remove the line, and save the file.


To learn more about the available options, see the man page:

man crontab

Over and out .. Cheers Krishna and Shahed
 


