Posts

Using a value file in a parameter set in Information Server DataStage

Question
How do I create and use a value file within a parameter set in DataStage?

Answer
Using a value file in a parameter set allows you to set the values of parameters dynamically. For example: Job A updates the value file, and Job B uses a parameter set that points to that value file. When moving jobs from development to test or production, you can update the value file to reflect different values for the different environments without having to recompile the jobs.

To create a new parameter set, select File > New and select "Create new Parameter Set". This launches the parameter set dialog. Fill out the appropriate information on the General tab and then proceed to the Parameters tab. On this tab, enter the parameters you wish to include in the parameter set. Note that you can also add existing environment variables.

The last tab, Values, allows you to specify a value file name. This is the name of the file that will automatically be created on the Engine tier. This tab also allows ...
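As a hedged illustration (the parameter set name, parameter names, values, and path are hypothetical), the value file created on the Engine tier is just a plain text file of Name=Value pairs, typically stored under the project's ParameterSets directory:

# Hypothetical value file "dev" for a parameter set named "DBConnect",
# e.g. <project directory>/ParameterSets/DBConnect/dev on the Engine tier.
# Encrypted parameters are stored in encrypted form, not in clear text.
$ cat dev
DBServer=devdb01
DBUser=etl_dev
DBPort=1521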

A DataStage job does not use the new value that is put in the Parameter set.

Problem (Abstract)
A DataStage job does not use the new value that is put in the parameter set.

Cause
If you make any changes to a parameter set object, these changes will be reflected in job designs that use this object up until the time the job is compiled. The parameters that a job is compiled with are the ones that will be available when the job is run (although if you change the design after compilation, the job will once again link to the current version of the parameter set).

Diagnosing the problem
Examine the log entry "Environment variable settings" for parameter sets. If the parameter set specifies the value "(As predefined)", the parameter set is using the value that was used during the last compile.

Resolving the problem
If the value of the parameter set may be changed, you should specify a value file for the parameters or set the parameters in the parameter set (including encrypted parameters) to $PROJDEF.
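As a sketch (the project, job, parameter set, and value file names are assumptions), the value file to use can also be named explicitly at run time with the dsjob command, so the run picks up the current file contents rather than compile-time defaults:

# Run the job and tell the parameter set "DBConnect" to read the value file "dev".
# Project "dstage1" and job "LoadCustomers" are hypothetical.
$ dsjob -run -param DBConnect=dev dstage1 LoadCustomers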

How to set default values for Environment Variables without re-compiling DataStage jobs

Question
Is it possible to set or change the default value of an environment variable without re-compiling the DataStage job?

Answer
Yes, it is possible to set or change the default value of an environment variable without recompiling the job. You can manage all of your environment variables from the DataStage Administrator client. To do this, follow these steps:

1. Open the Administrator client, select the project you are working in, and click Properties.
2. On the General tab, click Environment.
3. Create a new environment variable under the "User Defined" section, or update the variable if it already exists.
4. Set the value of the variable to what you want the DataStage job to inherit.
5. Close the Administrator client so the variable is saved.
6. Open the DataStage Designer client and navigate to the Job Properties.
7. Add the environment variable name that you just created in the Administrator client.
8. Set the value of the new variable to $PROJDEF. The job will then inherit whatever default value is currently defined for the variable at the project level, so you can change it later without recompiling the job.
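As a minimal command-line sketch (the variable name, value, and project name are assumptions, and the dsadmin connection options such as -domain, -user, -password, and -server are omitted for brevity), the same project-level default can also be changed with the dsadmin tool, which is handy for scripted environment promotion:

# Set the project-level default of a user-defined environment variable.
# "MY_DB_SERVER", "devdb01", and the project "dstage1" are hypothetical.
$ dsadmin -envset MY_DB_SERVER -value devdb01 dstage1

# List the current environment variable defaults for the project.
$ dsadmin -listenv dstage1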

How to fix missing and under-replicated blocks - HDFS

# Switch to the HDFS superuser (substitute your HDFS user name for <hdfs_user>).
$ su - <hdfs_user>

# Collect the paths of all under-replicated files reported by fsck.
$ hdfs fsck / | grep 'Under replicated' | awk -F ':' '{print $1}' >> /tmp/under_replicated_files

# Re-apply the desired replication factor to each affected file.
$ for hdfsfile in `cat /tmp/under_replicated_files`; do echo "Fixing $hdfsfile :"; hadoop fs -setrep 3 $hdfsfile; done

In the above command, 3 is the replication factor. If you are running a single DataNode, it must be 1.
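The title also mentions missing blocks; as a hedged addition (paths are placeholders), the standard hdfs fsck options below can be used to locate files with missing or corrupt blocks and, only as a last resort, remove files whose blocks cannot be recovered:

# List files that currently have corrupt or missing blocks.
$ hdfs fsck / -list-corruptfileblocks

# Show block details and locations for a suspect path.
$ hdfs fsck /path/to/suspect/dir -files -blocks -locations

# Last resort: permanently delete corrupt files whose blocks cannot be recovered.
$ hdfs fsck / -delete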

Mule: Getting an error when hitting a REST API via HTTPS

Search for "tls-default.conf" in the Mule installation folder. This will show you the file for each runtime that you have installed. In that file there is a property called "enabledProtocols"; make sure that it contains TLSv1, as below:

enabledProtocols=TLSv1,TLSv1.1,TLSv1.2
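As a hedged sketch of where this setting lives (the exact path and neighbouring properties can vary by runtime version), the relevant part of the file typically looks like this:

# $MULE_HOME/conf/tls-default.conf
# TLS/SSL protocols enabled for the runtime's HTTPS connections.
enabledProtocols=TLSv1,TLSv1.1,TLSv1.2
# Cipher suites are configured alongside the protocols via enabledCipherSuites
# (left at the runtime's existing value here).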

InfoSphere DataStage – How to Generate DataStage Technical Design Job Reports

Report Generation Steps

Step 1: Log in to DataStage from your client.
Step 2: Navigate to the job you wish to document and open/edit the job.
Step 3: Go to 'File' and click on 'Generate Report'.
Step 4: When the Report Details dialog box appears, choose a style sheet, set 'Automatically Purge After:', edit the report name and description if desired, and click 'OK'.
Step 5: To view the report in DataStage, click on the hyperlink in the dialog box that DataStage displays.
Step 6: The rendered DataStage job report will appear in the web browser.
Step 7: To save the report for use outside of InfoSphere, within the browser navigate to 'File' > 'Save As'.
Step 8: Navigate to the desired path, choose "Web Archive, single file (*.mht)", and click Save.

I have found the "*.mht" format to be the best for exporting DataStage reports, but you may choose any format which meets your needs.

IAM policy for Start/Stop Instance in region

We can use an EC2 ARN with the StartInstances and StopInstances actions. The policy will look like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "cloudwatch:*",
        "ec2:Describe*"
      ],
      "Effect": "Allow",
      "Resource": "*"
    },
    {
      "Action": [
        "ec2:StartInstances",
        "ec2:StopInstances"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:ec2:us-east-1:123456789012:instance/*"
    }
  ]
}
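As a usage sketch (the policy name, file name, role name, and account ID are placeholders), the policy can be created and attached with the AWS CLI:

# Save the JSON above as start-stop-policy.json, then create the managed policy.
$ aws iam create-policy \
    --policy-name StartStopInstancesUsEast1 \
    --policy-document file://start-stop-policy.json

# Attach the policy to an IAM role that needs to start/stop the instances.
$ aws iam attach-role-policy \
    --role-name Ec2OperatorRole \
    --policy-arn arn:aws:iam::123456789012:policy/StartStopInstancesUsEast1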