File Datasource breaks on file upload
-
No, I'm using the template. I placed it in the required templates csv directory and then clicked compile. After a while, I noticed the data source was not parsing the directory I had set for reading the data. I manually tried uploading the same file I showed you earlier to see if it would actually insert some data points and update them, yet nothing has gone in.
Something's definitely off.. -
Eh, have to ask :D. Similarly, is create points enabled for your data source?
Hmm. I did test that one some...
You can try uncommenting the print statements if you're running in a console, or you can
throw new java.lang.RuntimeException("What's happening and where?!");
which should convey whatever insight can be gleaned to the logs. Is it possible your file begins with the import prefix? This would prevent it from importing through REST as well.
The ClassLoader for the importing class is associated with the DataSourceRT, so if you restart the data source it will reload the class if you have modified/recompiled it. Otherwise, it may reload it at its discretion.
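If you do go the exception route, a quick way to see exactly what importRow is being handed is to embed the row in the message. This is just a throwaway illustration to drop inside importRow while debugging, not something to keep:
/* Hypothetical debug line inside importRow(String[] row, int rowNum):
   the parsed values show up in the logged exception message */
throw new java.lang.RuntimeException("importRow: rowNum=" + rowNum + ", row=" + java.util.Arrays.toString(row));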
-
Ah the "Create Missing Points" tickbox did the job. Never thought to tick that as to me it meant it was adding additional point values into the mix. Maybe if it was named Generate datapoints from file or something to that effect I'd be alright. Non matter, looks like all is working and we're at the end of the proverbial slide!
Thanks for your patience Phil. Will keep you updated if I run into any other bizarre behaviour.
-
Always good to check the help for data sources! Glad we got it resolved!
I wonder if I should do something about reloading the class whenever the compile button is hit... That seems like it could be more intuitive. Normally I find myself renaming my test file and hitting save to get a poll to happen, though!
-
Hi again Phil, would you be willing to explain in further detail how to run this in the console? I need to do some further programming and debugging, and it would be good to have this working for files that have a different number of logging devices - some have two, others one or even three...
-
I'd also need a way to remember/recall the first few rows to designate the deviceName, name, and XID values, if possible...
-
Running it in the console means starting Mango on the command line using either
ma-start.bat
or
./ma.sh start
from the Mango/bin/ directory.
It is possible to store and recall them. In the code you posted, the "headerMap" is storing an integer (position in the row) as a key for the column header. You can do something like that: declare a member variable in your class, assign to it, and then refer to it in subsequent calls to 'importRow'.
It is possible to pass in "deviceName" / "xid" / "name" for a given "identifier" (the data file data point property, which is the first argument of the ImportPoints we are adding to parsedPoints) by putting those attributes in the "extraParams" map for the first import point with that identifier. So, had you never run the importer before, and you put
extraParams.put("deviceName", "Streats");
then the points would have been created with that device name.
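To tie the two ideas together, here is a minimal sketch. The header layout (device name in the first cell of the first row, identifier in column 1, value in column 2) is made up purely for illustration, and the timestamp parsing is replaced with a placeholder:
import java.util.HashMap;
import java.util.Map;

import com.infiniteautomation.datafilesource.contexts.AbstractCSVDataSource;
import com.infiniteautomation.datafilesource.dataimage.NumericImportPoint;

public class SketchCsvImporter extends AbstractCSVDataSource {

    /* Member variable: remembered between importRow calls for the same file */
    private String deviceName;

    @Override
    public void importRow(String[] row, int rowNum) {
        if (row.length <= 1)
            return;

        /* First row: stash whatever we need to recall later */
        if (rowNum == 0) {
            this.deviceName = row[0];
            return;
        }

        /* Pass the remembered value along so newly created points pick it up */
        Map<String, String> extraParams = new HashMap<String, String>();
        extraParams.put("deviceName", this.deviceName);

        /* identifier, value, timestamp, extraParams - real timestamp parsing omitted here */
        this.parsedPoints.add(new NumericImportPoint(row[1],
                Double.parseDouble(row[2]), System.currentTimeMillis(), extraParams));
    }
}
The same pattern works for "xid" or "name" - just add them to the same extraParams map.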
-
Excellent, I'll give it a crack. Thanks Phil!
-
A little confused with an error in the system alarms. Not sure if I should start a new thread or can continue in this one since it relates to the same data and importing code...
I receive Urgent-level errors when importing data that say "Failed to find all expected points". I don't know if this is because it expects all data points to be in a single file when it parses them (each file hosts a different sensor and its readings) or if it's my code. I'll add it below:
import java.util.HashMap;
import java.util.Map;

import org.joda.time.format.DateTimeFormat;
import org.joda.time.format.DateTimeFormatter;

import com.infiniteautomation.datafilesource.contexts.AbstractCSVDataSource;
import com.infiniteautomation.datafilesource.dataimage.NumericImportPoint;

public class StreatsCsvImporter extends AbstractCSVDataSource {

    private DateTimeFormatter dtf = DateTimeFormat.forPattern("dd/MM/yyyy hh:mm:ss a");
    private Map<Integer, String> headerMap = new HashMap<Integer, String>();

    @Override
    public void importRow(String[] row, int rowNum) {
        if (row.length <= 1)
            return;

        /* Extract deviceName from first line in file */
        if (rowNum == 0) {
            /*
            for (int i = 0; i < row.length; i++) {
                System.out.println(row[i]);
            }
            return;
            */
            String loc = row[0].split("=")[1];
            /* System.out.println(loc); */
            String deviceName = row[2].split("=")[1];
            /* System.out.println(deviceName); */
            this.headerMap.put(0, loc.concat(" - ").concat(deviceName));
            /* System.out.println(this.headerMap.get(0)); */
            return;
        }

        /* Extract point names from 2nd column onwards in 2nd line of file */
        if (rowNum == 1) {
            for (int i = 1; i < row.length; i++) {
                headerMap.put(i, row[i]);
            }
            return;
        }

        long dt;
        String timeString = row[0].replace(":00", ":00:00").replace("a.m.", "AM").replace("p.m.", "PM");
        /* System.out.print("Timestring: " + timeString + "\n"); */
        try {
            dt = dtf.parseDateTime(timeString).getMillis();
        } catch (Exception e) {
            // Gobble
            // e.printStackTrace();
            return;
        }

        /* Extra params option to set deviceName property */
        Map<String, String> extraParams = new HashMap<String, String>();
        extraParams.put("deviceName", headerMap.get(0));

        for (int i = 1; i < row.length; i++) {
            this.parsedPoints.add(new NumericImportPoint(headerMap.get(i), Double.parseDouble(row[i]), dt, extraParams));
            /* this.parsedPoints.add(new NumericImportPoint("DP_T", Double.parseDouble(row[2]), dt, extraParams)); */
        }
    }
}
-
You are correct that it expects all data points in every file. It's raised as a data source error but it probably should be more specific, so that you can turn it off. I'll bring that up, as it would be fairly easy to break that out into its own event type.
-
If you could do that I'd be very grateful. Would be good to be able to ignore it if I could.
-
This was added and should be released fairly soon.