Customer Portal

XMLExtract Data Mapping Issues

12 comments

  • slechtaj
    Hi Siddhant,

    I assume the issue is caused by the fact that you use auto-mapping in your XMLExtract mapping. Auto-mapping maps fields (no matter whether they are elements or attributes) to output metadata fields with the same name. All you need to do is turn off auto-mapping and map the elements and attributes to the output fields manually, as required. See the screenshot below for more information.

    MapXMLExtractFields.png

    Hope it helps!
  • siddhant
    Thank you so much!

    I need your help with one more thing, please.


    I am reading a file from the source below

    https://api-public.guidebox.com/v1.43/U ... ovie/19975

    using JSONExtract.

    The issue is that each time I load a new record, the number at the end of the URL (19975) must change dynamically; the values are movie IDs coming from my database tables. Is there any way I can do that?

    So my URL actually needs to look like this:

    https://api-public.guidebox.com/v1.43/U ... Biv/movie/{value returned from database table}

    And each time I load a new set of data, I want to write it simultaneously to a JSON file on my drive.
  • slechtaj
    Hi Siddhant,

    You can prepare the URL with a Reformat and send it through an edge to the input port of the JSONExtract. In the File URL attribute, you will use the input port with the processing type source. The resulting value of the File URL attribute will look like this:
    port:$0.yourUrlField:source
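
    As a sketch, the URL preparation that the Reformat performs can be expressed in plain Python like this. The base URL here is a placeholder, not the real (truncated) Guidebox URL, and the field names are assumptions:

```python
# Sketch of what the Reformat step does: turn each movie id coming from
# the database into a full request URL for the downstream component.
# BASE_URL is a placeholder, not the real (truncated) URL from the thread.
BASE_URL = "https://api-public.guidebox.com/v1.43/EXAMPLE/movie/{movie_id}"

def build_url(movie_id):
    """Return the request URL for a single movie id."""
    return BASE_URL.format(movie_id=movie_id)

# Example: ids as they might come from the database table.
movie_ids = [19975, 20001, 20417]
urls = [build_url(mid) for mid in movie_ids]
```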
  • siddhant
    Hi slechtaj,
    Thank you so much once again.

    I have managed to prepare those URLs, but now I am facing quite a serious issue. There are almost 94,000 such URLs, which I am feeding into my JSONExtract component's File URL property.

    But when I run the graph, JSONExtract fails on a random URL with the error "URL not reachable, check if the URL is valid or not", even though when I check that URL in a web browser it is accessible.

    In this case my whole graph fails. To avoid this failure and to re-access the failed URL from JSONExtract, what can I do? Please suggest something as soon as possible.
  • siddhant
    Hi slechtaj,

    I had one more query, regarding accessing the same URLs from different components in the same graph. Is it possible to hit 100,000 pre-prepared URLs from the database with many different JSONExtract components at the same time? Would there be any deadlock situation?
  • slechtaj
    Hi,

    I'm sorry, I didn't realize you need to process a remote file. In this case, you can get the content of the file using HTTPConnector: simply pass the URL into the HTTPConnector using the Input mapping attribute and then map the content of the file to an output field using the Output mapping attribute. In JSONExtract, make sure the processing type is set to discrete.
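
    For the transient "URL not reachable" failures mentioned above, the usual remedy is to retry the request a few times before giving up, so one flaky URL does not kill the whole run. A minimal sketch in plain Python, with the fetcher injected as a parameter (the real call would be an HTTP request):

```python
import time

def fetch_with_retries(url, fetch, attempts=3, delay=0.0):
    """Call fetch(url), retrying on failure so that a single flaky
    request does not abort the whole load (the analogue of the graph
    failing on one of 94,000 URLs)."""
    last_err = None
    for _ in range(attempts):
        try:
            return fetch(url)
        except Exception as err:  # in practice: URLError, timeout, HTTP 5xx
            last_err = err
            time.sleep(delay)
    raise last_err

# Example with a stand-in fetcher that fails twice, then succeeds:
calls = {"n": 0}
def flaky_fetch(url):
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("URL not reachable")
    return '{"id": 19975}'

body = fetch_with_retries("https://example.invalid/movie/19975", flaky_fetch)
```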
  • siddhant
    Hi slechtaj,
    I have done it according to what you said. The graph is working now. Thank you so much.
    I am highly grateful for your quick response.
  • siddhant
    Hi Slechtaj,

    I had a query regarding the mapping of nested child elements in the JSONExtract component.
    I have to map all the child elements to a single output port.
    I am attaching my graph and my source file below; please have a look.
    The file contains many child elements, which are further nested, and I want to map them all to output port 0.
    Is it possible? Please let me know as soon as possible.

    Thank you. Copy of new-graph.grf
  • slechtaj
    Hi Siddhant,

    I am not sure I understand what you need. However, if your goal is to get all the information from the XML file into one Clover record, you will have to use multiple output ports first and then join them together using a key (in your case it might be the id of the record).
    You should also keep two things in mind:

    • First, you always have to start the mapping from the most nested object (for example, the link attribute of the facebook object). Each nested object has access to its parent object's values.

    • Second, JSONExtract uses a SAX parser; therefore, if an object has multiple attributes (e.g. the facebook element has facebook_id and link), you should set the mapping on the last attribute, otherwise you would not be able to map the values of the other attributes.


    I have prepared an example that should help you understand the logic - see the attached graph.
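
    The end result of that multi-port-and-join approach, i.e. nested objects collapsed into one flat record keyed by id, can be sketched in plain Python. The document structure below (id, title, a nested facebook object) is an assumed example, not the real Guidebox schema:

```python
# Sketch of collapsing nested objects into a single flat record.
# The structure of `record` is an assumed example for illustration only.
record = {
    "id": 19975,
    "title": "Example Movie",
    "facebook": {"facebook_id": 123, "link": "https://facebook.example/123"},
}

def flatten(obj, prefix=""):
    """Flatten nested dicts into one flat record, prefixing child keys
    with their parent's name (facebook.link, facebook.facebook_id, ...)."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=name + "."))
        else:
            flat[name] = value
    return flat

flat = flatten(record)
# flat["facebook.link"] now sits next to flat["id"] in a single record
```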
  • siddhant
    Hi slechtaj,

    Thank you so much for your reply to my data mapping query.
    Now I am facing a new issue which I am not able to understand.
    My graph fails when I try to load a large volume of data into the database, and I am not able to understand the error logs.
    I am attaching the error log below; can you please help me figure out what the issue is and how I can resolve it? Error_Log.txt
  • siddhant
    Hi slechtaj,

    I had a query regarding error handling in CloverETL. Is there any way to trap an error thrown by any of the components, i.e. my DBOutputTable or JSONExtract component, and continue the execution of the graph to fetch data from the web server and insert it into the database? I am fetching bulk data from web servers, and suddenly I get a JSON parser error and my graph fails. Is there any way to resume the graph from the point of failure?

    Thank You,
    Siddhant Dilip Satav
  • slechtaj
    Hi Siddhant,

    To your first question: the log is from a parent graph, so it contains only very limited information about the child graph. However, it seems you are facing some kind of network connection issue. I would recommend first checking whether you are able to perform the same action from another client application.

    Regarding the second question: Clover components of this type are usually able to handle such situations. You can define the allowed number of failed records on a component (e.g. DBOutputTable) and attach an output port to collect these records for further processing. As for the JSONExtract and XMLExtract components, you should always first make sure that the document you want to read is valid and matches the structure your mapping is tailored to.
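
    The reject-port pattern described above, validating each document and routing bad ones aside instead of aborting, can be sketched in plain Python. The payloads here are made-up examples:

```python
import json

# Sketch of "collect rejects instead of failing": validate each payload
# and route malformed ones to a reject list for later reprocessing, so
# one bad document does not stop the whole bulk load.
payloads = ['{"id": 1}', 'not json at all', '{"id": 3}']

good, rejected = [], []
for raw in payloads:
    try:
        good.append(json.loads(raw))
    except json.JSONDecodeError:
        rejected.append(raw)  # analogue of the component's error/reject port
```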

    Hope it helps.
