How do I clone a datasource?
-
Click the copy icon in the data source list.
-
Sorry, I wasn't clear. I'd like to make, say, 100 or 1000 copies with a single click or SQL command. Is there a way? Has anyone done this?
-
Doesn't everyone do that? :)
I'd suggest exporting your data source, and using a text editor to make clones (changing the name and xid values as necessary in the data source and points), and then importing.
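A rough sketch of that approach in Python (assuming the export is a JSON object with `dataSources` and `dataPoints` arrays, that suffixing the name and xid values is enough to make the import create new objects, and with placeholder file names):

```python
import copy
import json

def clone_export(data, n):
    """Return a new export dict with n suffixed copies of every
    data source and its points; names and xids get a numeric suffix
    so the import creates new objects instead of overwriting."""
    out = {"dataSources": [], "dataPoints": []}
    for i in range(1, n + 1):
        for ds in data.get("dataSources", []):
            ds2 = copy.deepcopy(ds)
            ds2["name"] = "%s_%d" % (ds2["name"], i)
            ds2["xid"] = "%s_%d" % (ds2["xid"], i)
            out["dataSources"].append(ds2)
        for dp in data.get("dataPoints", []):
            dp2 = copy.deepcopy(dp)
            dp2["name"] = "%s_%d" % (dp2["name"], i)
            dp2["xid"] = "%s_%d" % (dp2["xid"], i)
            # keep each cloned point attached to its cloned source
            dp2["dataSourceXid"] = "%s_%d" % (dp2["dataSourceXid"], i)
            out["dataPoints"].append(dp2)
    return out

# Load your export, clone it, and write a new file to import:
#   with open("export.json") as f:
#       clones = clone_export(json.load(f), 100)
#   with open("clones.json", "w") as f:
#       json.dump(clones, f)
```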
-
I had that idea. Thanks for validating it :) But when I tried it, it was very slow and killed my browser. I'll try again.
-
How many data sources were you importing? And how many points in each?
Another possibility is to write a program that will access the database and duplicate rows as necessary. You will of course need to reference Mango classes to deserialize objects, but you will also have the existing copying code to use as a reference.
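The row-duplication idea could look something like the following. Note this is purely illustrative: the table and column names (`dataSources`, `xid`, `name`, `data`) are hypothetical stand-ins, not Mango's real schema, it uses an in-memory SQLite database rather than Mango's, and a real version would still need Mango's classes to deserialize and rewrite the stored objects:

```python
import sqlite3

# Hypothetical stand-in for Mango's data source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dataSources (xid TEXT, name TEXT, data BLOB)")
conn.execute("INSERT INTO dataSources VALUES ('DS_SJR', 'SJR', x'00')")

def duplicate_rows(conn, n):
    """Re-insert each existing data source row n times,
    giving every copy a new xid and name suffix."""
    rows = conn.execute("SELECT xid, name, data FROM dataSources").fetchall()
    for xid, name, data in rows:
        for i in range(1, n + 1):
            conn.execute("INSERT INTO dataSources VALUES (?, ?, ?)",
                         ("%s_%d" % (xid, i), "%s_%d" % (name, i), data))

duplicate_rows(conn, 100)  # 1 original row -> 101 rows total
```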
-
Around 10 DS with 20 DP each. I was able to copy the export from the browser, and it came to 6135 lines.
Is this supposed to be a valid JSON file? I tried to load it from Python but I get this error:
f = open('datos.mango', 'r')
f
<open file 'datos.mango', mode 'r' at 0xb77639c0>
json.load(f)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.6/json/__init__.py", line 267, in load
parse_constant=parse_constant, **kw)
File "/usr/lib/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python2.6/json/decoder.py", line 336, in raw_decode
obj, end = self._scanner.iterscan(s, **kw).next()
File "/usr/lib/python2.6/json/scanner.py", line 55, in iterscan
rval, next_pos = action(m, context)
File "/usr/lib/python2.6/json/decoder.py", line 183, in JSONObject
value, end = iterscan(s, idx=end, context=context).next()
File "/usr/lib/python2.6/json/scanner.py", line 55, in iterscan
rval, next_pos = action(m, context)
File "/usr/lib/python2.6/json/decoder.py", line 227, in JSONArray
raise ValueError(errmsg("Expecting , delimiter", s, end))
ValueError: Expecting , delimiter: line 1038 column 27 (char 30609)
-
Yes, it is valid JSON. I'm not familiar with Python stack traces myself. Can you start cutting out sections to see if there is anything in particular that is causing the problem?
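For example, a small helper (hypothetical, not Mango code) that uses the character offset from the parser's error message to show the text around the point where it choked:

```python
import json

def show_json_error(text, context=40):
    """Try to parse text as JSON; on failure, return the snippet
    surrounding the character offset the parser reported."""
    try:
        json.loads(text)
        return None  # parsed fine
    except ValueError as e:
        # CPython's json error messages end with "... (char NNNN)"
        pos = int(str(e).rsplit("(char ", 1)[1].rstrip(")"))
        return text[max(0, pos - context):pos + context]

# e.g. print(show_json_error(open("datos.mango").read())) would show
# the region around char 30609 from the traceback above.
```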
-
OK, I removed all DS and DP except one of each kind, and the load process worked as expected:
f = open('datos.mango', 'r')
import json
a = json.load(f)
a
{u'dataSources': [{u'updatePeriodType': u'SECONDS', u'retries': 0, u'transportType': u'TCP_KEEP_ALIVE', u'createSlaveMonitorPoints': False, u'name': u'SJR', u'updatePeriods': 20, u'enabled': True, u'encapsulated': False, u'port': 502, u'quantize': False, u'host': u'10.0.0.10', u'contiguousBatches': False, u'timeout': 10000, u'xid': u'DS_SJR', u'type': u'MODBUS_IP', u'alarmLevels': {u'POINT_WRITE_EXCEPTION': u'INFORMATION', u'POINT_READ_EXCEPTION': u'INFORMATION', u'DATA_SOURCE_EXCEPTION': u'URGENT'}}], u'dataPoints': [{u'loggingType': u'ON_CHANGE', u'engineeringUnits': u'DEGREES_CELSIUS', u'discardHighLimit': 1.7976931348623157e+308, u'purgeType': u'YEARS', u'name': u'02-ir0 Temp Impulsion AA1', u'purgePeriod': 1, u'intervalLoggingPeriod': 15, u'enabled': True, u'intervalLoggingType': u'INSTANT', u'discardLowLimit': -1.7976931348623157e+308, u'pointLocator': {u'additive': 0.0, u'multiplier': 0.10000000000000001, u'slaveMonitor': False, u'range': u'INPUT_REGISTER', u'offset': 0, u'modbusDataType': u'TWO_BYTE_INT_UNSIGNED', u'settableOverride': False, u'bit': 0, u'slaveId': 2}, u'chartRenderer': {u'numberOfPeriods': 2, u'type': u'IMAGE', u'timePeriodType': u'HOURS'}, u'eventDetectors': [], u'discardExtremeValues': False, u'tolerance': 0.0, u'xid': u'DP_SJR02-ir0', u'intervalLoggingPeriodType': u'MINUTES', u'textRenderer': {u'type': u'ANALOG', u'suffix': u'\xb0C', u'format': u'0.0'}, u'dataSourceXid': u'DS_SJR', u'defaultCacheSize': 1}]}
-
Can you try keeping a DS/DP that has info that looks like this:
File "/usr/lib/python2.6/json/__init__.py", line 267, in load
parse_constant=parse_constant, **kw)