
    Please Note: This forum exists for community support for the Mango product family and the Radix IoT Platform. Although Radix IoT employees participate in this forum from time to time, there is no guarantee of a response to anything posted here, nor can Radix IoT, LLC guarantee the accuracy of any information expressed or conveyed. Customers with active support contracts are asked to send specific project questions to support@radixiot.com.


    File Datasource breaks on file upload

    User help
    • phildunlap

      Eh, have to ask :D. Similarly, is create points enabled for your data source?

      Hmm. I did test that one some...

      You can try uncommenting the print statements if you're running in a console, or you can throw new java.lang.RuntimeException("What's happening and where?!"); which should convey whatever insight can be gleaned to the logs.
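      As a minimal sketch of that debugging trick (the class and message here are made up; this is not Mango's importer API), throwing from inside a row-parsing method makes the failing spot show up in the logs:

      ```java
      // Hypothetical sketch: a deliberate RuntimeException pinpoints where
      // parsing bails out, since the message and stack trace land in the logs.
      public class WhereAmI {
          static String firstCell(String[] row, int rowNum) {
              if (row.length <= 1) {
                  // Mango logs this message and stack trace when the importer runs
                  throw new RuntimeException("Short row at line " + rowNum + "?!");
              }
              return row[0];
          }
      }
      ```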

      Is it possible your file begins with the import prefix? This would prevent it from importing through REST, as well.

      The ClassLoader for the importing class is associated with the DataSourceRT, so restarting the data source will reload the class if you have modified/recompiled it. Otherwise, it may reload it at its discretion.

      • MattFox

        Ah, the "Create Missing Points" tickbox did the job. Never thought to tick that, as to me it meant it was adding additional point values into the mix. Maybe if it were named "Generate data points from file" or something to that effect I'd have been alright. No matter, looks like all is working and we're at the end of the proverbial slide!

        Thanks for your patience Phil. Will keep you updated if I run into any other bizarre behaviour.

        • phildunlap

          Always good to check the help for data sources! Glad we got it resolved!

          I wonder if I'll do anything about reloading the class whenever the compile button is hit... That seems like it could be more intuitive. Normally I find myself renaming my test file and hitting save to get a poll to happen, though!

          • MattFox

            Hi again Phil, would you be willing to explain in further detail how to run this in the console? I need to do some further programming and debugging, and it would be good to have this working for files that have a different number of logging devices - some have two, others one, or even three...

            • MattFox @MattFox

              Would also need to have a way to remember/recall the first few rows to designate the deviceName, name, and XID values if possible...

              • phildunlap

                Running it in the console means starting Mango on the command line using either ma-start.bat or ./ma.sh start from the Mango/bin/ directory.

                It is possible to store and recall them. In the code you posted, the "headerMap" is storing an integer (the position in the row) as the key for each column header. You can do something like that: declare a member variable on your class, assign to it, and then refer to it in subsequent calls to 'importRow'.
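                As a rough sketch of that pattern (plain Java, with the Mango types left out), a member map filled from the header row survives across importRow calls for the same file:

                ```java
                import java.util.HashMap;
                import java.util.Map;

                // Sketch only: headers read from row 0 are cached in a member
                // variable so every later data row can look them up.
                public class HeaderCacheSketch {
                    private final Map<Integer, String> headerMap = new HashMap<>();

                    public void importRow(String[] row, int rowNum) {
                        if (rowNum == 0) {
                            for (int i = 0; i < row.length; i++)
                                headerMap.put(i, row[i]); // remember column headers
                        }
                        // subsequent rows would call headerFor(i) when building ImportPoints
                    }

                    public String headerFor(int column) {
                        return headerMap.get(column);
                    }
                }
                ```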

                It is possible to pass in "deviceName" / "xid" / "name" (data file data point properties; the first argument to the ImportPoints we are adding to parsedPoints is the "identifier") by passing those attributes in the "extraParams" map for the first import point with that identifier. So, had you never run the importer before, and you put

                extraParams.put("deviceName", "Streats");
                

                then the points would have been created with that device name.
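                A slightly fuller sketch of the same idea (the values below are invented for illustration; the keys are the properties named above):

                ```java
                import java.util.HashMap;
                import java.util.Map;

                public class ExtraParamsSketch {
                    // Builds the attribute map passed with the first ImportPoint
                    // for an identifier, using the property keys described above.
                    public static Map<String, String> buildExtraParams() {
                        Map<String, String> extraParams = new HashMap<>();
                        extraParams.put("deviceName", "Streats");       // device name for created points
                        extraParams.put("name", "Streats Temperature"); // hypothetical point name
                        extraParams.put("xid", "DP_streats_temp");      // hypothetical XID
                        return extraParams;
                    }
                }
                ```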

                • MattFox

                  Excellent, I'll give it a crack. Thanks Phil!

                  • MattFox

                      I'm a little confused by an error in the system alarms. Not sure if I should start a new thread or continue in this one, since it relates to the same data and importing code...

                      I receive Urgent-level errors when importing data that say "Failed to find all expected points". I don't know if this is because it expects all data points to be in a single file when it parses them (each file hosts a different sensor and its readings) or if it's my code. Will add it below:

                      import java.util.HashMap;
                      import java.util.Map;
                      
                      import org.joda.time.format.DateTimeFormat;
                      import org.joda.time.format.DateTimeFormatter;
                      
                      import com.infiniteautomation.datafilesource.contexts.AbstractCSVDataSource;
                      import com.infiniteautomation.datafilesource.dataimage.NumericImportPoint;
                      
                      public class StreatsCsvImporter extends AbstractCSVDataSource {
                      
                      	private DateTimeFormatter dtf = DateTimeFormat.forPattern("dd/MM/yyyy hh:mm:ss a");
                      	private Map<Integer, String> headerMap = new HashMap<Integer, String>();
                      	
                      	
                      	@Override
                      	public void importRow(String[] row, int rowNum) {
                      		if( row.length <= 1)
                      			return;
                      		/* Extract devicename from first line in file */
                      		if(rowNum==0)
                      		{
                      			/* for( int i=0; i<row.length; i++)
                      			{
                       				System.out.println(row[i]);
                      			}
                      				return; */				
                      			String loc = row[0].split("=")[1];
                      			/* System.out.println(loc); */
                      			String deviceName = row[2].split("=")[1];
                      			/* System.out.println(deviceName); */
                      			this.headerMap.put(0, loc.concat(" - ").concat(deviceName) );
                      			/* System.out.println(this.headerMap.get(0) ); */
                      			return;
                      		}
                      		/* extract point names from 2nd column onwards in 2nd line of file */
                      		if(rowNum==1)
                      		{
                      			for(int i=1; i<row.length; i++)
                      			{
                       				headerMap.put(i, row[i]);
                      			}
                      			return;
                      		}
                      			
                      		long dt;
                      		String timeString = row[0].replace(":00", ":00:00").replace("a.m.", "AM").replace("p.m.", "PM");
                      		/* System.out.print("Timestring: " + timeString + "\n"); */
                      		try {
                      			dt = dtf.parseDateTime(timeString).getMillis();
                      		} catch(Exception e) {
                      		//	// Gobble
                      			// e.printStackTrace();
                      			return;
                      		}
                      		
                      		// /* Extra params option to set deviceName property */
                      		Map<String, String> extraParams = new HashMap<String,String>();
                      		extraParams.put("deviceName", headerMap.get(0) );
                      		
                      		for(int i=1; i<row.length; i++)
                      		{
                       			this.parsedPoints.add(new NumericImportPoint(headerMap.get(i), Double.parseDouble(row[i]), dt, extraParams));
                      		/* this.parsedPoints.add(new NumericImportPoint("DP_T", Double.parseDouble(row[2]), dt, extraParams)); */
                      		}
                      	}
                      
                      }
                      
                      • phildunlap

                        You are correct that it expects all data points in every file. It's raised as a data source error but it probably should be more specific, so that you can turn it off. I'll bring that up, as it would be fairly easy to break that out into its own event type.

                        • MattFox

                          If you could do that I'd be very grateful. Would be good to be able to ignore it if I could.

                          • phildunlap

                            This was added and should be released fairly soon.
