Customer Portal

Troubleshooting a PostgreSQL JNDI connection

Comments 4

  • ramiro
    Update on this issue: the JNDI setup is actually working; the problem is that the DBExecute Clover graphs don't work with JNDI connections, they just time out (CloverETL Server 3.5.2.12/12).

    Does anybody know why?
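    For anyone else hitting this: the JNDI binding itself can be sanity-checked with a few lines of plain Java. This is only a sketch, not CloverETL code; the name java:comp/env/redshift is the one used later in this thread, and the lookup only succeeds inside the application server's JVM (in a standalone JVM it fails with NoInitialContextException, which the sketch treats as an expected outcome):

```java
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.naming.NoInitialContextException;
import javax.sql.DataSource;
import java.sql.Connection;

public class JndiCheck {
    // Resolve the pooled DataSource the server binds under java:comp/env.
    static DataSource lookup(String name) throws NamingException {
        return (DataSource) new InitialContext().lookup(name);
    }

    public static void main(String[] args) {
        try (Connection c = lookup("java:comp/env/redshift").getConnection()) {
            // If this succeeds, the pool handed out a live connection.
            System.out.println("connection ok");
        } catch (NoInitialContextException e) {
            // Expected when run outside the application server's JVM.
            System.out.println("no JNDI provider in this JVM");
        } catch (Exception e) {
            System.out.println("lookup or connect failed: " + e.getMessage());
        }
    }
}
```

    If this small check succeeds but the graph still times out, the problem is in how the graph borrows connections from the pool rather than in the JNDI setup.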
  • slechtaj
    Hi Ramiro,

    What kind of statements are you running? Can you share the graph and the whole log with us (ideally in debug log level)?
    In general, for DML statements you should use DBInputTable / DBOutputTable. However, this does not apply to Redshift, since a different process is recommended – see our blog for more information.
  • ramiro
    Thanks for the RedshiftWriter Subgraph recommendation, Jan; it will be worth exploring. Unfortunately we are stuck on version 3.5 of CloverETL for the moment, so hopefully we can make the DBExecute graphs work with JNDI instead.

    This is the particular SQL statement I am running (but in fact, it can be any SQL):

    insert into my_schema.transform_times (description, site_id, timestamp, batch_id, identifier) values ('staging_fact_tr_user_has_state_by_day_active', 777, getdate(), 149, 'start');


    This is the error message I am getting:

    Component [truncate staging tables:TRUNCATE_STAGING_TABLES] finished with status ERROR.
    Component pre-execute initialization failed.
    Cannot establish DB connection to JNDI:java:comp/env/redshift Cannot get a connection, pool error Timeout waiting for idle object


    If I switch my connection from JNDI to a regular PostgreSQL driver connection, the same graphs and SQL work perfectly.

    Attached are the log file (at debug level) and the source code of the graph I am running. Let me know if I can send any other information that may help troubleshoot. Thank you!
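    A note on the error text: "Cannot get a connection, pool error Timeout waiting for idle object" is the wording a commons-pool-style connection pool uses when every pooled connection is already checked out and the borrow wait (maxWait) expires. A minimal sketch of that semantics (not CloverETL's actual pool code), with a semaphore standing in for the pool's idle-object queue:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class PoolTimeoutDemo {
    // Stand-in for "borrow a connection, waiting up to maxWait for an idle one".
    static boolean tryBorrow(Semaphore idle, long maxWaitMillis) throws InterruptedException {
        return idle.tryAcquire(maxWaitMillis, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        Semaphore idle = new Semaphore(2); // pretend the pool's maxActive is 2
        idle.acquire(); // one component borrows a connection and holds it
        idle.acquire(); // a second component borrows the other one
        // A third borrower waits out maxWait and gives up, which the pool
        // surfaces as "Timeout waiting for idle object".
        if (!tryBorrow(idle, 200)) {
            System.out.println("Timeout waiting for idle object");
        }
    }
}
```

    If a component holds its borrowed connection for the whole run while another component tries to borrow from the same small pool, this timeout is exactly what you see, even though the JNDI resource itself is healthy.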
  • slechtaj
    I have already sent you the answer via email, but I will post it here as well for other users.

    First of all, I would definitely use the Redshift JDBC driver instead of the PostgreSQL one, although the PostgreSQL driver should work in most cases. You can find the Redshift drivers and further information about their usage in the Redshift documentation here.

    Secondly, for writing data (INSERT statements) you should always use DBOutputTable. The same applies to reading – you should use DBInputTable for SELECT operations. Please refer to the DBExecute, DBInputTable, and DBOutputTable (http://doc.cloveretl.com/documentation/UserGuide/index.jsp?topic=/com.cloveretl.gui.docs/docs/dboutputtable.html) articles in our documentation for more information.

    However, I would strongly recommend the other approach (see the blog article) – using the COPY command. This approach is also recommended by Amazon: "We strongly recommend using the COPY command to load large amounts of data. Using individual INSERT statements to populate a table might be prohibitively slow." [Redshift documentation - Using a COPY Command to Load Data]
    You might also find the Redshift documentation article on the COPY command useful.
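    To make the COPY suggestion concrete, here is a hypothetical helper that assembles such a statement for the transform_times table mentioned earlier in the thread. The S3 path and IAM role ARN are placeholders, and older clusters may authorize the load with a CREDENTIALS clause instead of IAM_ROLE; the resulting string would be run through a plain Statement on the Redshift connection:

```java
public class CopyCommand {
    // Assemble a Redshift COPY statement; all values are supplied by the
    // caller, so table, path and role are purely illustrative.
    static String copySql(String table, String s3Path, String iamRole) {
        return "COPY " + table
             + " FROM '" + s3Path + "'"
             + " IAM_ROLE '" + iamRole + "'"
             + " CSV";
    }

    public static void main(String[] args) {
        // Hypothetical bucket and role ARN, not real resources.
        System.out.println(copySql("my_schema.transform_times",
                "s3://my-bucket/transform_times.csv",
                "arn:aws:iam::123456789012:role/RedshiftCopy"));
    }
}
```

    The point of the design is that one COPY loads a whole staged file server-side, instead of shipping one INSERT per row over JDBC.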
