Posts

Showing posts from September, 2014

dsadmin Command for administering Projects

dsadmin is the command used for administering DataStage projects and has a large range of options.

Command syntax:

dsadmin [-file <file> <domain> <DataStage server> | -domain <domain> -user <user> -password <password> -server <DataStage server>] <primary command> [<arguments>]

Valid primary command options are: -createproject -deleteproject -oshvisible -enablercp -enablejobadmin -envadd -envdelete -envset -advancedruntime -basedirectory -deploymentdirectory -customdeployment -listprojects -listproperties -listenv -enablegeneratexml -protectproject

File format: each entry in the file should take the form domain,server,user,password, and the dsadmin command is then: dsadmin -file filename domain server <command arguments>, where domain is the Information Server domain name and server is the DataStage engine tier server name.

Creating a project: dsadmin -file file domain dsengine -createproject ProjectName [-location ...
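As a sketch of the credentials file format described above (the domain, engine, user, and password values here are placeholder assumptions, not values from the article):

```shell
# Placeholder names and credentials -- substitute your own.
# One comma-separated entry: domain,server,user,password
cat > /tmp/dsadmin_creds.txt <<'EOF'
mydomain,myengine,dsadm,secret
EOF

# The command then names the file plus the domain and engine to use;
# shown with echo here rather than actually invoking dsadmin.
echo "dsadmin -file /tmp/dsadmin_creds.txt mydomain myengine -listprojects"
```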

WebSphere MQ commands

In this section, we look at the WebSphere MQ commands we need to set up and administer the MQ environment. The following table lists the V6 WebSphere MQ commands:

Ganglia — installation on Red Hat Enterprise Linux

Install the following RPMs using yum:

[root@ip-10-111-51-54:/etc] rpm -qa | grep ganglia
ganglia-gmetad-3.0.7-1.el5
ganglia-3.0.7-1.el5
ganglia-web-3.0.7-1.el5
ganglia-gmond-3.0.7-1.el5

yum install ganglia ganglia-gmetad ganglia-gmond ganglia-web
sudo /usr/bin/passwd ganglia
/sbin/chkconfig --levels 345 gmond on
[root@ip-10-111-51-54:/etc] /sbin/service gmond start
Starting GANGLIA gmond:                                    [  OK  ]
/sbin/chkconfig --levels 345 gmetad on
[root@ip-10-111-51-54:/etc] /sbin/service gmetad start
Starting GANGLIA gmetad:                                   [  OK  ]
/usr/sb...

Disable autostart on server reboot for DataStage engine and metadata tier

Log on to the DataStage engine server as root and execute the following commands:

cd $DSHOME

Step 1) Source the dsenv file:

. ./dsenv
cd scripts
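The sourcing step above can be sketched as follows; the default engine path used here is an assumption, so adjust $DSHOME for your install:

```shell
# Default engine tier path is an assumption; adjust for your install.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
if [ -f "$DSHOME/dsenv" ]; then
    # Source the engine environment before using anything under scripts/
    . "$DSHOME/dsenv"
    cd "$DSHOME/scripts"
fi
echo "DSHOME is $DSHOME"
```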

Installing and Configuring Subversion on CentOS

This document explains how to install, configure, and use Subversion with Apache on CentOS 5.8.

[root@svn ~]# cat /etc/redhat-release
CentOS release 5.8 (Final)
[root@svn ~]# uname -r
2.6.18-308.el5

istool command line

The istool command line is available on the client and engine tiers. You can import and export many types of assets, and you can query and delete common metadata. Location of the istool command line: on UNIX or Linux, the istool command framework is located in installation_directory/Clients/istools/cli, where installation_directory is the directory where you installed InfoSphere Information Server, for example /opt/IBM/InformationServer.
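As an illustrative sketch only (the host, credentials, and asset path below are placeholders, and option spellings should be verified with istool export -help on your version), an export invocation might be assembled like this:

```shell
# All values below are placeholder assumptions, not taken from the article.
DOMAIN="services-host:9080"
ARCHIVE="/tmp/ds_assets.isx"
# Built as a string and echoed rather than executed here.
CMD="istool export -domain $DOMAIN -username isadmin -password secret -archive $ARCHIVE -datastage 'engine/project/Jobs/*.pjb'"
echo "$CMD"
```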

Steps to configure the DataStage Operations Console in 8.7

Step 1) Creating the operations database schema: creating the operations database schema for an Oracle database on a UNIX or Linux system. Database: Oracle 11g

Collecting Information Server information

Depending on the product and the Information Server component that you are using, there may be product-specific or component-specific information that you will also need to collect. For the data and file collection tasks, assume the following default folder locations; actual folder locations may differ:

Ops Console Workload Manager

Windows, AIX, Solaris, Linux, or HP-UX. Files and data to collect, with descriptions:

<IS_HOME>\Server\DSWLM\logs\* and <IS_HOME>\Server\DSWLM\logs\wlm.properties : all logs and Workload Manager properties
<IS_HOME>\Server\DSWLM\dist\lib\wlm.config.xml : Workload Manager configuration file; check wlm.properties for a possibly different location of wlm.config.xml
<IS_HOME>\Server\DSWLM\start.wlm.log : Workload Manager start log

XML Transformation information and files to collect

Windows only. Files and data to collect, with descriptions:

Information Server 8.5 and above (client only):
<USER_HOME>\Application Data\Macromedia\Flash Player\Logs\flashlog.txt : log file for the DataStage XML Transformation UI Flex application. (Note: this file will only exist when using the debug version of the Adobe Flash Player.)
<USER_HOME>\ds_logs\xmlui_*.log : log file for the DataStage XML Transformation UI .NET application. Note: capture of all log files here is already covered by the main DS client collection. A new log file is created every time the user invokes the DataStage XML Transformation connector application. The file name of a log begins with either "xmlui_assembly" or "xmlui_mdi_importer", followed by a time stamp indicating when the log was created (e.g., xmlui_assembly_2010.10.06_11.35.30.093.log).

Tip for troubleshooting: the timestamp used in the log filename xmlui*.log is used to create the client correlation ID (clientid). I...

Balanced Optimization

Windows users. Files and data to collect, with descriptions:

Information Server 8.1: <BALOP_HOME>\logs\*.* : <BALOP_HOME> defaults to C:\IBM\BalancedOptimization
Information Server 8.5 and above: <USER_HOME>\Application Data\IBM\Information Server\DataStage Client\<ClientTagID>\BalOp\logs\*.*

The ClientTagID is located in the <IS_HOME>\Version.xml file, for example:

<InstallType client="true" clientTagId="652e77e8-ea1e-4850-a8b2-11e6a33dd884" currentVersion="8.5.0.0" document="false" domain="true" engine="true" repository="true"/>

SalesForce Connector

To provide IBM Support with sufficient information to debug a problem with the SalesForce Connector, the following are the minimal collection requirements that should be provided: the exported job, the Director log, and the debugging directory. Use the following commands to gather and generate the files described above.

Oracle Connector, Oracle Enterprise Stage, Oracle OCI Plugin, DRS Plugin

Oracle Enterprise Stage information. If a segmentation fault occurs before the job aborts, set the environment variable APT_NO_PM_SIGNAL_HANDLERS equal to 1 and re-run the job. The core file will now be produced and can be found in the project directory. Obtain the stack trace from the core file after sourcing dsenv, and send the stack trace to IBM Support.

To enable Oracle Enterprise trace, define the environment variable APT_DEBUG_MODULE_NAMES and set it equal to one or more of the following values: oraread, orawrite, orautils, oralookup. For example, to debug an upstream (reading) Oracle Enterprise stage, define APT_DEBUG_MODULE_NAMES='orautils oraread'. Use the following commands to gather and generate the files described above.
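The two environment variable settings above can be sketched as follows (the variable names and values are from the text; where you export them, for example in dsenv or in the job's environment, depends on your setup):

```shell
# Produce a core file on segmentation fault instead of trapping the signal
export APT_NO_PM_SIGNAL_HANDLERS=1
# Trace an upstream (reading) Oracle Enterprise stage
export APT_DEBUG_MODULE_NAMES='orautils oraread'
env | grep '^APT_'
```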

ODBC Connector, ODBC Enterprise Stage, ODBC Stage

Use ODBC trace when a problem is encountered with a component that uses ODBC to access data sources. This includes the following components:

ODBC Connector (DataStage parallel jobs)
ODBC Enterprise Stage (DataStage parallel jobs)
ODBC Stage (DataStage server jobs)
Information Analyzer

Since collection of ODBC trace can affect performance, enable ODBC trace before you re-run a job or repeat a task, and disable ODBC trace as soon as possible.

Problems with metadata import in Information Analyzer: if you are collecting ODBC trace because of problems with metadata import in Information Analyzer, restart the ASB Agent service after enabling or disabling the trace. To do so, follow these instructions:
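On UNIX or Linux, ODBC trace is typically switched on in the [ODBC] section of the .odbc.ini file used by the engine. The exact key names vary by driver manager, so treat this fragment as an assumption to verify against your driver documentation:

```ini
[ODBC]
; Hypothetical fragment: key names vary between driver managers
Trace=1
TraceFile=/tmp/odbc_trace.log
```

Remember to remove or zero these settings again once the trace has been captured, since tracing slows every ODBC operation.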

SAP packs

All platform users. Files and data to collect, with descriptions:

Server (DataStage engine) machine:
<IS_HOME>/Server/DSEngine/.dsrel : DS version
<IS_HOME>/Server/DSSAPbin/.dsSAPPackrel (under Windows collect <IS_HOME>\Server\DSSAPbin\dsSAPPackrel.txt) : R3 Pack version (if the file exists, it is a valid R3 Pack installation)
<IS_HOME>/Server/DSBWbin/.dsSAPBWPackrel (under Windows collect <IS_HOME>\Server\DSSAPbin\dsSAPBWPackrel.txt) : BW Pack version (if the file exists, it is a valid BW Pack installation). It is not present in BW Pack 4.3.1 (bug)
<IS_HOME>/Server/DSEngine/bin/IWSDSPSAPR3*.SYS2 and <IS_HOME>/Server/DSEngine/bin/IWSDSPSAPBW*.SYS2 : license files showing the installed R3/BW Pack version. Use the files as a flag for an existing installation, for example IWSDSPSAPR30600.SYS2 → R3 Pack 60, IWSDSPSAPBW0403.SYS2 → BW Pack 43
Contents of the folder indicated by the environment variable $RFC_TRACE_DIR ...

Information Server auditing files to collect

All platform users. Files and data to collect, with descriptions:

Server machine:
<WAS_IS_PROFILE>\logs\ISauditLog_0.log and <WAS_IS_PROFILE>\logs\ISauditLog_1.log : audit log files (latest two instances) in the default location. See below for modified locations.

XMeta Metabrokers and Bridges

All platform users. Files and data to collect, with descriptions:

Services (domain) machine:
<IS_HOME>\ASBServer\install\logs\xmeta-install.log : xmeta installation log
<IS_HOME>\ASBServer\install\etc\history\xmeta-install-status.hist : xmeta installation history file
<IS_HOME>\ASBServer\install\bin\install.properties : xmeta installation configuration file

Server (i.e., DS engine) machine:
<IS_HOME>\ASBNode\install\logs\xmeta-install.log : xmeta installation log
<IS_HOME>\ASBNode\install\etc\history\xmeta-install-status.hist : xmeta installation history file
<IS_HOME>\ASBNode\install\bin\install.properties : xmeta installation configuration file

Client machine:
<IS_HOME>\ASBNode\install\logs\xmeta-install.log : xmeta installation log
<IS_HOME>\ASBNode\install\etc\history\xmeta-install-status.hist : xmeta installation history file
<IS_HOME>\ASBNode\install\bin\install.properties : xmeta installation...