Issue 1 :
The sessions "s_m_ABCD01_STAGING_SESSION" and "s_m_ABCD01_ARCH_LOAD" in the worklet wkflw_ABCD01_STAGING_LOAD (workflow wkflw_ABCD01_STAGING_LOAD) failed with the error below.
Message:
SQL Error [
ORA-01017: invalid username/password; logon denied
ORA-02063: preceding line from ABCD010LINK
Database driver error...
Function Name : Execute
SQL Stmt : SELECT ABCD01_MNO.CUST_NO, ABCD01_MNO.CUST_SUBJ, ABCD01_MNO.CUST_USER, ABCD01_MNO.CUST_ADMIN FROM ABCD01_TABLE
Oracle Fatal Error
Database driver error...
Function Name : Execute
Solution :
The query fires on the source database, so the password needs to be updated for the database link (ABCD010LINK) defined on the source database instance.
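A minimal sketch of the fix, assuming the link is a schema-level link owned by the connecting user; the schema credentials, remote user/password, and TNS alias below are placeholders, not values from the actual environment:

```shell
# Run against the SOURCE database instance that owns the failing DB link.
# ABCD010LINK is taken from the ORA-02063 message above; everything else
# is a placeholder to adapt.
sqlplus schema_owner/owner_pwd@SOURCE_DB <<'SQL'
DROP DATABASE LINK ABCD010LINK;
CREATE DATABASE LINK ABCD010LINK
  CONNECT TO remote_user IDENTIFIED BY "new_password"
  USING 'REMOTE_TNS_ALIAS';
SQL
```

On Oracle 11.2.0.2 and later, `ALTER DATABASE LINK ABCD010LINK CONNECT TO remote_user IDENTIFIED BY "new_password";` can change the stored password without dropping and recreating the link.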
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 2 :
Sessions and workflows were going into a hung status on an Informatica infrastructure built on Solaris 10.
Solution :
This is a known issue with the rmail utility used by the Email task on the Solaris 10 operating system (kernel version: SunOS 5.10 Generic_147440-27). As a long-term solution, remove whichever Email tasks you are using and replace them with Command tasks that call the mailx utility. This is a bug, confirmed by both the Informatica and Oracle vendors.
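A Command task replacing the Email task could invoke mailx along these lines; the subject, message, and recipient address are placeholders:

```shell
# Hypothetical Command task command line: send the notification through
# mailx instead of the rmail-based Email task that hangs on Solaris 10.
echo "Informatica workflow load completed" \
  | mailx -s "Informatica load status" ops-team@example.com
```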
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 3 :
Debugger giving error “Error connecting to the DTM at [%s : %d]: [%s]”
Solution :
Where do we run the Debugger?
In Designer, open the mapping you want to debug.
Go to Mappings --> Debugger --> Start Debugger.
This brings up a series of windows in which we need to select the Integration Service, the Debugger session type, the related session of the mapping, and the related source/target/lookup connections.
How we solved it:
In the second window, the Debugger asks for the Integration Service and the Debugger session type.
There are three Debugger session types.
The user facing this issue was using "Use an existing session instance".
We suggested selecting "Create a debug session instance" instead, so that we could check where it was going wrong.
After creating the new session instance (with the right parameters selected: session and source/target/lookup connections), the Debugger started running fine (as per the update from the user).
Other resolutions:
Cause 1 :
The Debugger crashes because the ports it uses are already taken by another application (Yahoo Messenger, for example). Close those applications and try again; it will work.
Resolution:
In Designer, go to Tools --> Options --> Debug tab --> check the box for "TCP/IP - Pick a port between Min and Max automatically".
That should stop the Debugger crashes. Hope it helps.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 4 :
Debugger giving error “Error connecting to the DTM at [%s : %d]: [%s]”
This happens if you are trying to run the Debugger over NAT or a VPN.
Resolution:
Add a custom property "Debugger DTMHostName" (without the quotes) with the value of the full server name where the Integration Service is hosted. Then disable and re-enable the Integration Service. Make sure the default port (or the custom port) is open on your firewall and on the modem/server where you set the NAT entries.
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 5 :
Java Transformation issue : users were unable to compile the Java transformation and were getting the errors below.
- Unable to find any class file
- Unable to create the java source file
Analysis :
While analyzing the mapping design we found a data-type mismatch on the field METRIC_ID: it was used as a String in all the transformations, but inside the Java code it was declared as an Integer.
Resolution :
We
have advised the project team to make it as an Integer in all
transformations and validated the mapping which resolved the issue
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 6 :
An Informatica load got stuck with the message "TIMEOUT BASED COMMIT POINT".
Solution :
Below are solutions to "TIMEOUT BASED COMMIT POINT" issues from my own experience.
• One of the sessions, s_m_XYZ_DBG_REPORT, in our heavy ODS load had the same issue; we created an index, which resolved it.
• For one of the fact sessions (s_PQ_AR_FACT_LOAD) we involved the DB team, and their solution was "Undo retention has been increased to 2400 seconds from 1800 seconds".
• Check how much the source data count has increased, because in one case the issue was resolved by decreasing the commit interval.
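The undo-retention change mentioned above can be sketched as follows; it must be run by a DBA on the database being loaded, and the 2400-second value is the one quoted in the text:

```shell
# undo_retention is a dynamic parameter, so no instance restart is needed.
sqlplus / as sysdba <<'SQL'
SHOW PARAMETER undo_retention;
ALTER SYSTEM SET undo_retention = 2400 SCOPE = BOTH;
SQL
```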
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 7 :
When the session recovery option was turned on at the session level, the Informatica session threw the error below.
NODE_PTDEV_SERD02190 : TM_6371 : The Integration Service will create recovery table [PM_TGT_RUN_ID] in database [serd02190.ciscrim.com] because it does not exist.
Database driver error...
Function Name : Execute Direct
SQL Stmt : CREATE TABLE PM_TGT_RUN_ID (LAST_TGT_RUN_ID NUMBER NOT NULL)
Oracle Fatal Error
Database driver error...
Function Name : Execute Direct
Oracle Fatal Error] setting up [PM_TGT_RUN_ID] table.
Solution :
Will have to use the “Base Schema” account at Informatica connection
(Global Object) or else full privileges have to be given to the user
used at Informatica connection.
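If the Base Schema account cannot be used, a minimal sketch of the grants involved looks like this; INFA_CONN_USER is a placeholder for the user configured in the Informatica relational connection, and the exact privileges and tablespace required may differ per environment:

```shell
sqlplus / as sysdba <<'SQL'
-- Allow the connection user to create the PM_TGT_RUN_ID recovery table
-- in its own schema and to allocate space for it.
GRANT CREATE TABLE TO INFA_CONN_USER;
ALTER USER INFA_CONN_USER QUOTA UNLIMITED ON USERS;
SQL
```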
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 8:
One of the workflows in the folder ABCD01, migrated to the new server, was going into Unknown status. We ran it manually again and it still went into Unknown status. When it was run from the back end through the Workflow Manager, we got the following error.
(INFFORMIS) Start workflow: Request acknowledged
(INFFORMIS) Start workflow: ERROR: Workflow [ABCD01:wfklw_ABCD_RUN[version 1]]: Parameter file [/apps/Infa/staging/SrcFiles/ABCD01/INFA/ABCD.PAR] not found.Please check the Integration Service log for more information.
Solution:
The parameter file ABCD.PAR was migrated from the old server to the new server and the paths within the parameter file were changed. When the workflow was scheduled for its next run, it again went into Unknown status with the error below.
Start workflow: ERROR: Workflow [ABCD01:wfklw_ABCD_RUN[version 1]]: Parameter file
[/apps/Infa/staging/SrcFiles/ABCD01/INFA/ABCD.PAR] not found.Please check the Integration Service log for more information.
When we checked the workflow, the parameter file path had been wrongly updated to /apps/Infa/staging/SrcFiles/ABCD01/ABCD.PAR; after changing it to $PMSourceFileDir/ABCD01/INFA/ABCD.PAR, the workflow ran fine.
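A quick sanity check after a migration is to confirm that the resolved path actually exists on the new server; the $PMSourceFileDir value below is an assumption based on the paths in the error message:

```shell
# Verify the parameter file is where the workflow expects it to be.
PMSourceFileDir=/apps/Infa/staging/SrcFiles   # hypothetical service variable value
if test -f "$PMSourceFileDir/ABCD01/INFA/ABCD.PAR"; then
  echo "parameter file found"
else
  echo "parameter file missing"
fi
```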
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 9:
Problem Statement :
One of the sessions, s_mL_AB_RUN_DFACT, in our environment had been running for three days and had around 250K records to load.
We tried to load this table three times; on all three occasions it failed with "snapshot too old" after running for three days.
It loads 110K records, then the throughput drops to 1 row/sec and the session log shows TIMEOUT BASED COMMIT POINT.
Analysis:
When we checked the session log, it showed only repeated lines of:
Message: TIMEOUT BASED COMMIT POINT
Solution :
1) It could be an issue of long caching time.
2) Decrease the commit interval (for example, from 10080 to 5000).
3) Check the DTM buffer settings and the resource usage and capacity of the host.
4) Set the Maximum Memory Allowed for Auto Memory attribute to 1024 MB or 2048 MB.
5) Most of the time the issue is with the undo retention, which has to be increased.
Example :
Undo retention was increased from 1800 seconds to 2400 seconds. The undo tablespace now has 2 GB of free space.
NAME                 TYPE         VALUE
-------------------- ------------ -----------
undo_management      string       AUTO
undo_retention       integer      2400
undo_tablespace      string       UNDOTBS1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 10 :
One of the users came up with the below issue on LinkedIn, stating:
"We are facing another issue. We migrated the code one month back; when we run the code manually from Informatica, the job succeeds. Last week we scheduled the job in a scheduling tool; when the job runs, it succeeds in less than a minute but it is not loading the data, and we are not able to fetch the session logs or workflow logs. Please find the error: unable to fetch the logs [LGS_10051] the registered log file does not exist."
Solution :
Please do the following:
1) Edit Session -> Properties -> Write Backward Compatible Session Log File -> enable this check box.
2) Edit Workflow -> Properties -> Write Backward Compatible Workflow Log File -> enable this check box.
3) Set the correct session log path in Properties -> Session Log File Directory.
4) Edit Workflow -> Configure Concurrent Execution -> enable this check box.
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Issue 11 :
WRT_8324 : Warning: Target Connection Group's connection doesn't support transactions. Targets [TARGET] may not be loaded according to specified transaction boundary rules!
Justification :
- This warning is triggered when a user-defined commit is in effect, which is enabled by default when the mapping contains effective Transaction Control transformations. The session cannot determine the transaction boundaries itself, because they are handled in the Transaction Control transformation.
- It can also appear when running a source-based or user-defined commit session whose targets do not recognize transaction boundaries; this can happen with flat file targets or with bulk loading.
- The Integration Service uses a source-based, target-based, or user-defined commit. You can choose a source- or target-based commit if the mapping has no Transaction Control transformation, or only ineffective ones. By default, the Integration Service performs a target-based commit.
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------