
*********** Start ODC Install and configure ***************************

ALL PASSWORDS ARE welcome1, except for the Cloudera Manager web login, which is admin/admin.

Install Oracle Direct Connector:

Start all 5 images; log in with welcome1.

On DW, in a terminal:
su
useradd -s /sbin/nologin -M hadoop
cd /usr/local
tar xzvf /home/oracle/Downloads/hadoop-0.20.2-cdh3u4.tar.gz
chown -R hadoop:hadoop hadoop-0.20.2-cdh3u4/

Then open a web browser on DW, go to http://ons1:7180, log in as admin/admin, and download the client config.

In a terminal:
cd /home/oracle/Downloads
Unzip the file you just downloaded:
unzip
cd hadoop-conf
Copy the conf files to /usr/local/hadoop.../conf, overwriting the original files. Then, still as root, go into that destination.

Tab-complete /usr/java/jdk1.6.0_32/ (copy it into the clipboard), then:
gedit hadoop-env.sh
and paste in the JDK path, without the trailing slash. Save and exit, then:
cd ../bin
./hadoop fs -ls /user
As root, create the install directory for ODC:
mkdir -p /opt/odc
chown -R oracle:oracle /opt/odc
Then exit into the oracle user in the terminal, cd into /opt/odc, then:
unzip /home/oracle/Downloads/orahdfs-1.0.0.0.0.zip
Then cd into the new orahdfs... directory, pwd, and copy that directory path. Then:
cd bin/
gedit hdfs_stream &
Paste what you copied into DIRECTHDFS_HOME, and then paste /usr/local/hadoop-0.20.2-cdh3u4 into the HADOOP_HOME variable. Save and exit, and, still as the oracle user in the terminal, execute the script (you should only get usage info):
./hdfs_stream
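As a sketch, the two edited lines inside hdfs_stream would end up looking roughly like this (the version-numbered paths are the ones used in this lab; your unzipped directory names may differ):

```shell
# Fragment of /opt/odc/orahdfs-1.0.0.0.0/bin/hdfs_stream after the edit.
# The pwd output you copied goes into DIRECTHDFS_HOME; HADOOP_HOME points
# at the client-side Hadoop unpacked earlier on DW.
DIRECTHDFS_HOME=/opt/odc/orahdfs-1.0.0.0.0
HADOOP_HOME=/usr/local/hadoop-0.20.2-cdh3u4
```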

Now:
su
mkdir -p /scratch/sales_ext_dir
chown -R oracle:oracle /scratch/sales_ext_dir/
chown -R oracle:oracle /scratch
chmod -R 764 /scratch/
Exit in the terminal to the oracle user, go into sqlplus as sysdba, then:
CREATE OR REPLACE DIRECTORY sales_ext_dir AS '/scratch/sales_ext_dir';
CREATE OR REPLACE DIRECTORY hdfs_bin_path AS '/opt/odc/orahdfs-1.0.0.0.0/bin';
CREATE USER HDFSUSER IDENTIFIED BY welcome1;
GRANT CREATE SESSION, RESOURCE TO hdfsuser;
GRANT EXECUTE ON SYS.UTL_FILE TO hdfsuser;
GRANT READ, WRITE ON DIRECTORY sales_ext_dir TO hdfsuser;
GRANT EXECUTE ON DIRECTORY hdfs_bin_path TO hdfsuser;

*********** Finished ODC Install and configure ***************************

*********** Start OLH Install ***************************

Go to node 2, open a terminal, and:
su

mkdir -p /opt/olh
chown -R DIS:DIS /opt/olh
Exit into the DIS user:
cd /opt/olh
unzip /home/DIS/Downloads/oraloader-1.1.0.0.1.x86_64.zip
cd oraloader-1.1.0.0.1/jlib/
pwd and copy the result (/opt/olh/oraloader-1.1.0.0.1/jlib), then:
su
gedit /etc/hadoop/conf/hadoop-env.sh &
Paste the pwd copy that you made into the uncommented HADOOP_CLASSPATH; do not forget the /*. It should look like:
export HADOOP_CLASSPATH="/opt/olh/oraloader-1.1.0.0.1/jlib/*:$HADOOP_CLASSPATH"
Save and exit. In the terminal, exit into the DIS user, then test the classpath with:
hadoop classpath
and check that you see the new settings.

*********** Finish OLH Install ***************************
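The HADOOP_CLASSPATH edit from the OLH install above can be sanity-checked with a small sketch (the jlib path assumes the lab's unzip location):

```shell
# Prepend the OLH jars to HADOOP_CLASSPATH, as done in hadoop-env.sh.
# Note that the trailing /* must stay literal inside the quotes: it is
# expanded by Java's classpath handling, not by the shell.
OLH_JLIB=/opt/olh/oraloader-1.1.0.0.1/jlib
export HADOOP_CLASSPATH="$OLH_JLIB/*:$HADOOP_CLASSPATH"
echo "$HADOOP_CLASSPATH"
```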

ODI: nothing to install; configure it later.

*********** Start R Install ***************************

On all 4 nodes (not the DW), ensure that you can get to the internet in Chrome.
Go to node 1 and set up a 4-tab terminal. In tabs 2-4:
ssh root@ons2 (and the same for ons3 and ons4)
Ensure that you are root in the local terminal (tab 1). In each of the 4 terminals, repeat these steps (as root):
cd /opt
tar xzvf /home/DIS/Downloads/R-2.13.2.tar.gz
chown -R DIS:DIS /opt/R-2.13.2
su DIS
cd /opt/R-2.13.2/
./configure
make
Then copy over the Oracle R plugin as the DIS user:
cp /home/DIS/Downloads/orhc.tgz library/
cd bin/
Then, on the command line:
export HADOOP_HOME=/usr/lib/hadoop

then:
./R CMD INSTALL /opt/R-2.13.2/library/orhc.tgz

*********** Done R Install ***************************

*********** Use ODC ***************************

Go to node 2. Open a browser, go to the Hue link, and log in as DIS/welcome1 if needed.
Create an HDFS directory called odctest under /user/DIS.
Upload the two CSV files from the local fs's Downloads directory and inspect the files.
Go to the DW image and go into sqlplus as sysdba.
In File Browser (the icon is on the desktop), open and inspect create_ext_table.sql in the /home/oracle/ directory.
As sysdba, execute:
@/home/oracle/create_ext_table.sql
Inspect that new table in SQL Developer. Then, back in sysdba, try:
select count(*) from "HDFSUSER"."test_data_ext_table";
You should get an error, because we have not tied the external table into the cluster, nor told it which HDFS files to point to.
To do this, Hadoop will need to introspect the external table in the database, and will therefore need access to the ojdbc6.jar file.
So, as root:

gedit /usr/local/hadoop-0.20.2-cdh3u4/conf/hadoop-env.sh
In the classpath, paste in /opt/app/product/11.2.0/dbhome_1/jdbc/lib/ojdbc6.jar, so it looks like:
# Extra Java CLASSPATH elements. Optional.


export HADOOP_CLASSPATH="/opt/app/product/11.2.0/dbhome_1/jdbc/lib/ojdbc6.jar:$HADOOP_CLASSPATH"

Save and exit. Then, as the oracle user, inspect odc_properties.xml, and as the oracle user execute:
/usr/local/hadoop-0.20.2-cdh3u4/bin/hadoop jar /opt/odc/orahdfs-1.0.0.0.0/jlib/orahdfs.jar oracle.hadoop.hdfs.exttab.ExternalTable -conf /home/oracle/odc_properties.xml -publish
Enter welcome1 as the password and you should see success.
Now go back into sysdba and repeat the query that failed:
select count(*) from "HDFSUSER"."test_data_ext_table";
then (no parallel):
select * from v$pq_sesstat;
then:
select /*+ parallel(a,2) */ count(*) from "HDFSUSER"."test_data_ext_table" a;
then:
select * from v$pq_sesstat;
Inspect the contents of the scratch directory on the fs.

*********** End Use ODC ***************************

*********** Begin config and use OLH ***************************

On DW, create a partitioned table for the test case. Inspect /home/oracle/create_partitioned_table_for_olh.sql and execute it as sysdba:
@/home/oracle/create_partitioned_table_for_olh.sql
Go to node 2. Open the Hue link in Chrome and, as the DIS Hue user, create a directory in /user/DIS called things, and copy in /home/DIS/Downloads/items.txt.
Then, in Beeswax, create a things table against that file, with field names:
itemno (int)
itemname (string)
price (double)
loc (string)
Inspect /home/DIS/myConf.xml and loaderMap.xml.
To accommodate an OCI output (as opposed to jdbc or dp/sqlloader ext table), we need to update the Unix env variables to include the instant client embedded within the OLH binaries.
As root:
gedit /etc/profile
Add:
export LD_LIBRARY_PATH=/opt/olh/oraloader-1.1.0.0.1/lib
export JAVA_LIBRARY_PATH=/opt/olh/oraloader-1.1.0.0.1/lib
Save and exit, and test it with:
. /etc/profile
reboot

While that is rebooting, you can go to node 1 as DIS and start up the Hive server (Thrift server) with:
hive --service hiveserver
Make sure you did that in the ons1 tab if you still have the multi-tab session open.
Check with a netstat to see that it is running on 10000:
netstat -ntlp | grep 10000
Go back to node 2:
cd /opt/olh/oraloader-1.1.0.0.1/jlib/
You'll need to copy over some jars from Hive for this job, since it is going against a Hive table as the input format. From /usr/lib/hive, copy in the following files:
hive-exec-0.7.1-cdh3u4.jar
hive-metastore-0.7.1-cdh3u4.jar
hive-serde-0.7.1-cdh3u4.jar
thrift-0.5.0.jar
thrift-fb303-0.5.0.jar
Review the following hadoop job command, then execute it:
hadoop jar oraloader.jar oracle.hadoop.loader.OraLoader -conf /home/DIS/myConf.xml -libjars /usr/lib/hive/lib/hive-metastore-0.7.1-cdh3u4.jar,/usr/lib/hive/lib/hive-serde-0.7.1-cdh3u4.jar,/usr/lib/hive/lib/hive-exec-0.7.1-cdh3u4.jar,/usr/lib/hive/lib/libthrift.jar,/usr/lib/hive/lib/thrift-fb303-0.5.0.jar
Inspect the target orcl table, review the MapReduce job details, and review the /user/DIS content.
So that was a Hive source table, but there are times when you will want to load a delimited text file (anything else needs a custom Java implementation, using the examples we have and its javadocs).
We will now do a delimited text example using the second canned input format.
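The netstat check on the Thrift port above can be wrapped in a small helper, assuming netstat is available (on newer systems ss -ntl is the equivalent):

```shell
# Return success if something is listening on the given TCP port.
check_port() {
  netstat -ntl 2>/dev/null | grep -q ":$1 "
}

if check_port 10000; then
  echo "hiveserver appears to be listening on 10000"
else
  echo "nothing listening on 10000 - (re)start the hive service on node 1"
fi
```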

Truncate the target table in SQL Developer.
Since the act of creating the things Hive table was a managed-table event, the original items.txt is gone, so let's upload it again from the local file system (Downloads) into the HDFS /user/DIS/things directory. We will not be using that for a Hive table, just pointing to it from the job.
Review loaderTextMap.xml and myConfText.xml.
Then, still in the jlib directory, we can inspect and execute:
hadoop jar oraloader.jar oracle.hadoop.loader.OraLoader -conf /home/DIS/myConfText.xml
(No -libjars here, since we're not invoking a "3rd party technology" like Hive; the libjars in myConfText.xml are sufficient to get this job done.)
Discuss: global indexes are not allowed for a direct path (OCI) load of table partition HDFSUSER.HADOOP_TABLE.

*********** End config and use OLH ***************************

************** Start ODI Config and Use **************************************

To start the config you will need to add some libraries to the Studio, as well as to its embedded design-time agent, so that it can speak to Hadoop, Hive, and OLH.
So... in a cmd line on node 2 as DIS:
cd /home/DIS/.odi/oracledi/userlib/
then:
gedit additional_path.txt
and add:
/usr/lib/hive/lib/*.jar
/usr/lib/hadoop/hadoop*core*.jar
/usr/lib/hadoop/hadoop*tools*.jar

Then, for the standalone agent:
cd /opt/middleware/ODI/oracledi/agent/drivers/
and:
cp /usr/lib/hive/lib/*.jar .
cp /usr/lib/hadoop/hadoop*core*.jar .
cp /usr/lib/hadoop/hadoop*tools*.jar .
Further edit /etc/profile:
su
gedit /etc/profile
Paste in, right above the export PATH line:
export ODI_HIVE_SESSION_JARS=/usr/lib/hive/lib/hive-contrib-0.7.1-cdh3u4.jar
export OLH_HOME=/opt/olh/oraloader-1.1.0.0.1
export HADOOP_HOME=/usr/lib/hadoop
export ODI_OLH_SHAREDLIBS=$OLH_HOME/lib/libolh11.so,$OLH_HOME/lib/libclntsh.so.11.1,$OLH_HOME/lib/libnnz11.so,$OLH_HOME/lib/libociei.so
export ODI_OLH_JARS=$OLH_HOME/jlib/ojdbc6.jar,$OLH_HOME/jlib/orai18n.jar,$OLH_HOME/jlib/orai18n-utility.jar,$OLH_HOME/jlib/orai18n-mapping.jar,$OLH_HOME/jlib/orai18n-collation.jar,$OLH_HOME/jlib/oraclepki.jar,$OLH_HOME/jlib/osdt_cert.jar,$OLH_HOME/jlib/osdt_core.jar,$OLH_HOME/jlib/commons-math-2.2.jar,$OLH_HOME/jlib/jackson-core-asl-1.5.2.jar,$OLH_HOME/jlib/jackson-mapper-asl-1.5.2.jar,$OLH_HOME/jlib/avro-1.5.4.jar,$OLH_HOME/jlib/avro-mapred-1.5.4.jar,$OLH_HOME/jlib/oraloader.jar,/usr/lib/hive/lib/hive-metastore-0.7.1-cdh3u4.jar,/usr/lib/hive/lib/libthrift.jar,/usr/lib/hive/lib/libfb303.jar,/usr/lib/hive/lib/hive-common-0.7.1-cdh3u4.jar,/usr/lib/hive/lib/hive-exec-0.7.1-cdh3u4.jar
Save, exit, and source it to test:

. /etc/profile
Then, still as root:
gedit /etc/hadoop/conf/hadoop-env.sh &
Paste into the HADOOP_CLASSPATH:
/usr/lib/hive/lib/hive-metastore-0.7.1-cdh3u4.jar:/usr/lib/hive/lib/libthrift.jar:/usr/lib/hive/lib/libfb303.jar:/usr/lib/hive/lib/hive-common-0.7.1-cdh3u4.jar:/usr/lib/hive/lib/hive-exec-0.7.1-cdh3u4.jar:
The final result looks like:
export HADOOP_CLASSPATH="/opt/olh/oraloader-1.1.0.0.1/jlib/*:/usr/lib/hive/lib/hive-metastore-0.7.1-cdh3u4.jar:/usr/lib/hive/lib/libthrift.jar:/usr/lib/hive/lib/libfb303.jar:/usr/lib/hive/lib/hive-common-0.7.1-cdh3u4.jar:/usr/lib/hive/lib/hive-exec-0.7.1-cdh3u4.jar:$HADOOP_CLASSPATH"
*** Warning: there is an entry there from our standalone OLH experiment that is still needed by ODI.
Save and exit, and then, to see if they show up, type:
hadoop classpath
reboot
In Hue, in /user/DIS, create the sales directory.
Upload, from node 2's /home/DIS/booksource/input/hive/joins/, sales.txt and things.txt (rename things.txt to products.txt).
On the command line, as the DIS user, copy over (replacing) the xmlreference folder:
cd /opt/middleware/ODI/oracledi/xmlreference/
then:

cp /home/DIS/Downloads/xmlreference/* .
You might want to hop over to node 1, Ctrl-C the Thrift server, and restart it.
Start the Studio on the desktop.
Import the Hive tech and then all the KMs into the global bucket.
Create a new File data server called HDFS, with a JDBC URL of hdfs://ons1.DISdemo.com (no 8020).
Then create a physical schema called /user/DIS/sales/ and save.
Create a logical schema; call it HDFS_sales.
Now create the Hive data server with the Hive driver and:
jdbc:hive://ons1.DISdemo.com:10000/default
and in the Flexfield deselect Default and paste in:
thrift://ons1.DISdemo.com:10000
*** Make sure it takes the pasted-in value, then click Default, save, close, open, uncheck, etc., etc.
Test the connection against the local agent; then you might as well start the standalone agent and test that.
Go into a cmd line in a new window as the DIS user and start the ODI agent.
Then test the Hive connection again.
Create a physical schema with default as the db name.
Now create a logical schema called HIVE.

Working with the KMs
RKM Hive, and File (regular or HDFS file) to Hive
Go into Designer / Models.
Create a new File model called HDFSsales and point it to the logical schema HDFS_sales.
Create a datastore underneath it; call it SALES.txt, with an alias of SALES_txt and a resource of sales.txt.
Files tab: delimited file, 0 header, tab delimiter.
Columns tab:
name (string) 12 12
itemid (numeric) 12 12
Save.
Then add another datastore; call it products.txt, alias PRODUCTS_TXT, resource products.txt, tab-delimited:
itemid (numeric) 12 12
itemname (string) 12 12
Create a new Hive model for the Hive table that you have (things); call it HiveTables, tech is Hive.
Hive RKM it (select customized).
Create 2 new datastores in Hive manually, one corresponding to sales.txt and another to products.txt.
For the sales table:
username (string) 50
itemno (int) 12

No flexfield settings.
And for the products Hive table:
itemnum (int) 12
prodname (string) 50
Create a new project folder called BigDataIntegration.
Now create the 'Load sales.txt hdfs file into sales hive table' interface.
Select Mapping, drag the sales Hive table into the target and sales.txt into the source, map the columns, go back to Overview, and select a staging area of HDFS_sales.
Click on Flow, then the target, and then select the IKM File to Hive.
In options, change from their defaults:
create target: yes
file is local: false
staging table: true
override row format:
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' STORED AS TEXTFILE
Test it to see if it works.
Then create the 'Load products.txt hdfs file into products hive table' interface using the same approach as above, and test to see if it works.
Now let's do a join, but first we need to create the Hive target table manually.
Under the Hive model we'll create the full-outer-join target table prodsales, with cols:

user (string) 50
itemnumber (int) 12
productname (string) 50
Then create an interface called 'Join sales hive table with products hive table into prodsales hive table'.
Add the two sources (the resulting Hive tables from the previous 2 interfaces) and then the target; map and join; select ordered full outer in properties; then, in Flow, the only real choice is Hive Control Append.
IKM options: create target true, all else default.
Note: you might get an error in the job about not being able to fetch a table; check or restart the Hive service on node 1 and try again.
Then try the OLH load IKM.
Create an Oracle database partitioned table called productsales: go into DW and in sysdba execute:
@/home/oracle/Create_partitioned_ProdSales.sql
Then go through the process of setting up the Oracle database in ODI.
Create a data server called OracleDW with:
jdbc:oracle:thin:@//dw.DISdemo.com:1521/orcl.DISdemo.com
(This connection style is mandatory because we will be using OLH in ODI with OLH's Direct Path OCI option.)
Connectivity is HDFSUSER/welcome1; test the connection.
Create a physical schema with HDFSUSER, then create a logical schema called DW.
Create a model called ORCLDW.
Then, in Reverse Engineering, select Customized, select the Orcl RKM, and reverse engineer just PR% (remove characters to remove from alias).

Create a new interface, 'Hive prodsales load to DW with OLH'.
In the source, use the Hive prodsales table from the previous 3 interfaces.
Select HIVE as the 'staging area different from target' in the interface's Overview tab.
Use straight mappings.
In Flow, select the IKM OLH with:
output: OCI
create target: false
truncate and delete all: true
staging table hive: true
oracle staging table: false
mapred output folder: /user/DIS/odiolhoutput
delete temp: true
Run it and it should work.
For grins and giggles, we can package everything together, but we need to drop all the Hive tables in Hue/Beeswax, truncate the target Orcl table, and then upload products.txt and sales.txt into sales.
Then create the package with the 4 interfaces. Call it EndtoEndETL.
