Is it possible to copy a slave into a new data source? I think setting the slave ID in the new data source would also be needed.
If not, is there any plan to add this feature?
Thanks,
I'd like to know if you plan to add user groups or user roles to Mango. That way, you could define privileges for a whole group instead of a single user.
This would make Mango usable for systems with many users.
Where are you from? Post your mail and I'll get in touch.
Ok, I removed all DS and DP except one of each kind, and the load process worked as expected:
>>> f = open('datos.mango', 'r')
>>> import json
>>> a = json.load(f)
>>> a
{u'dataSources': [{u'updatePeriodType': u'SECONDS', u'retries': 0, u'transportType': u'TCP_KEEP_ALIVE', u'createSlaveMonitorPoints': False, u'name': u'SJR', u'updatePeriods': 20, u'enabled': True, u'encapsulated': False, u'port': 502, u'quantize': False, u'host': u'10.0.0.10', u'contiguousBatches': False, u'timeout': 10000, u'xid': u'DS_SJR', u'type': u'MODBUS_IP', u'alarmLevels': {u'POINT_WRITE_EXCEPTION': u'INFORMATION', u'POINT_READ_EXCEPTION': u'INFORMATION', u'DATA_SOURCE_EXCEPTION': u'URGENT'}}], u'dataPoints': [{u'loggingType': u'ON_CHANGE', u'engineeringUnits': u'DEGREES_CELSIUS', u'discardHighLimit': 1.7976931348623157e+308, u'purgeType': u'YEARS', u'name': u'02-ir0 Temp Impulsion AA1', u'purgePeriod': 1, u'intervalLoggingPeriod': 15, u'enabled': True, u'intervalLoggingType': u'INSTANT', u'discardLowLimit': -1.7976931348623157e+308, u'pointLocator': {u'additive': 0.0, u'multiplier': 0.10000000000000001, u'slaveMonitor': False, u'range': u'INPUT_REGISTER', u'offset': 0, u'modbusDataType': u'TWO_BYTE_INT_UNSIGNED', u'settableOverride': False, u'bit': 0, u'slaveId': 2}, u'chartRenderer': {u'numberOfPeriods': 2, u'type': u'IMAGE', u'timePeriodType': u'HOURS'}, u'eventDetectors': [], u'discardExtremeValues': False, u'tolerance': 0.0, u'xid': u'DP_SJR02-ir0', u'intervalLoggingPeriodType': u'MINUTES', u'textRenderer': {u'type': u'ANALOG', u'suffix': u'\xb0C', u'format': u'0.0'}, u'dataSourceXid': u'DS_SJR', u'defaultCacheSize': 1}]}
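By the way, once it parses like this, cloning a whole data source looks like a few lines of Python. A rough sketch against the structure in the dump above (the _COPY suffix and the new slave ID of 3 are just my guesses; pick whatever keeps the xids unique):

import copy
import json

with open('datos.mango', 'r') as f:
    cfg = json.load(f)

src = cfg['dataSources'][0]           # the DS to clone, e.g. DS_SJR
clone = copy.deepcopy(src)
clone['xid'] = src['xid'] + '_COPY'   # xids must stay unique
clone['name'] = src['name'] + ' copy'
cfg['dataSources'].append(clone)

# bring over every data point that referenced the original DS
for dp in [p for p in cfg['dataPoints'] if p['dataSourceXid'] == src['xid']]:
    dp2 = copy.deepcopy(dp)
    dp2['xid'] = dp['xid'] + '_COPY'
    dp2['dataSourceXid'] = clone['xid']
    dp2['pointLocator']['slaveId'] = 3   # point the copies at the new slave (assumed ID)
    cfg['dataPoints'].append(dp2)

with open('datos_clone.mango', 'w') as out:
    json.dump(cfg, out, indent=2)

The result could then be pasted back into Mango's import box; I haven't checked how Mango reacts to duplicate names, so treat the naming scheme as a guess.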
Around 10 DS with 20 DP each. I was able to copy it from the browser; it came to 6135 lines.
Is this supposed to be a valid JSON file? I tried to load it from Python but I get this error:
>>> f = open('datos.mango', 'r')
>>> f
<open file 'datos.mango', mode 'r' at 0xb77639c0>
>>> json.load(f)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/json/__init__.py", line 267, in load
    parse_constant=parse_constant, **kw)
  File "/usr/lib/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.6/json/decoder.py", line 336, in raw_decode
    obj, end = self._scanner.iterscan(s, **kw).next()
  File "/usr/lib/python2.6/json/scanner.py", line 55, in iterscan
    rval, next_pos = action(m, context)
  File "/usr/lib/python2.6/json/decoder.py", line 183, in JSONObject
    value, end = iterscan(s, idx=end, context=context).next()
  File "/usr/lib/python2.6/json/scanner.py", line 55, in iterscan
    rval, next_pos = action(m, context)
  File "/usr/lib/python2.6/json/decoder.py", line 227, in JSONArray
    raise ValueError(errmsg("Expecting , delimiter", s, end))
ValueError: Expecting , delimiter: line 1038 column 27 (char 30609)
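For anyone hitting the same thing: the ValueError reports an exact character offset, so a quick way to see what Mango actually emitted at the bad spot is to slice the raw text around it (offset taken straight from the traceback above):

# print the text surrounding the offset the decoder complained about
s = open('datos.mango', 'r').read()
print(s[30609 - 60:30609 + 60])

That usually makes it obvious whether the paste was truncated or the export itself is malformed.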
I had that idea. Thanks for validating it :) But when I tried, it was very slow and killed my browser. I'll try again.
Sorry, I wasn't clear. I'd like to make, let's say, 100 or 1000 copies with one click or one SQL command. Is there a way? Has anyone done this?
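In case it helps, the JSON-export route sketched earlier should extend to bulk copies: loop N times and give every clone a unique xid. A minimal sketch, assuming you start from one data point in datos.mango (the numbering scheme is made up):

import copy
import json

with open('datos.mango', 'r') as f:
    cfg = json.load(f)

template = cfg['dataPoints'][0]          # the DP to mass-copy
for i in range(1, 101):                  # 100 copies; bump for 1000
    dp = copy.deepcopy(template)
    dp['xid'] = '%s_%03d' % (template['xid'], i)   # xids must be unique
    dp['name'] = '%s %03d' % (template['name'], i)
    cfg['dataPoints'].append(dp)

with open('datos_bulk.mango', 'w') as out:
    json.dump(cfg, out, indent=2)

Importing the result is then a single paste into the import page, which avoids the browser choking on one copy operation per point.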
How do I clone a data source with all its data points?
Thank you!