Speed?
#1
Hi all,

I'm posting here having posted the same in the LiveCode forum as well, but this seems a better place for it.
I was wondering about the speed of LiveCloud operations.

I'm based in London and using the London servers. I'm connecting via a very fast WiFi connection (no proxies).
First cdb_ping(*table): 504.756927 ms
Average cdb_ping: 139.960051 ms

First cdb_pingNode(*table): 8.497 ms
Average cdb_pingNode: 6.455 ms


The cdb_pingNode times seem very respectable; I'm not sure whether the cdb_ping times are in the expected range.
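For reference, timings like these can be gathered with simple millisecond stamps around each call. A minimal sketch (the loop below and the "myTable" name are hypothetical, not the original test harness):

```
-- hypothetical timing harness; "myTable" is a placeholder table name
local tStart, tTotal, i
repeat with i = 1 to 10
   put the milliseconds into tStart
   get cdb_ping("myTable") -- assumes cdb_ping takes a table name, as above
   add (the milliseconds - tStart) to tTotal
end repeat
put "Average cdb_ping:" && tTotal / 10 && "ms"
```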

I've timed some actions with LiveCloud, by storing 2 different tables in the cloud (21 records each, so quite a small number).
For each table, there is first a check whether each record already exists (on the premise that data for one or the other table may already be in the database).

On average it takes about 2.5 seconds per table to store the data in the cloud (average 2476.5 ms). 

Possibly this is due to my algorithm: all the data is already stored in a multidimensional array in LiveCode, and I build a tInputA array of records to upload with a 'for each' loop.

Pseudocode for each table:
Code:
local x, tInputA, tOutputA, tKey, tTableID

put cdb_TableID("tableName") into tTableID
repeat for each element tArray in pArray
  put tArray[key1][key2] into tKey
  // one cloud query per record to check whether it already exists
  put cdb_query("keyName","=",tKey,"tableName","cloud","recordList") into tOutputA
  if tOutputA is empty then // this record does not already exist in the table
    add 1 to x // only advance the index for records that will be created
    put tArray[key1][key2] into tInputA[tTableID][x]["keyName1"]
    ...
    put tArray[keyX][keyZ] into tInputA[tTableID][x]["keyNameX"]
  end if
end repeat
// single batch upload of all the new records
put cdb_batchCreate(tInputA,"cloud") into tOutputA

Keeping in mind that the full data may be >1000 records per table at a time and will involve at least 3 tables, I suspect this may be quite slow, though I haven't yet tested it on large numbers.

Is this expected performance? 
I suspect my code is the culprit; any advice on speeding this up?
#2
Hi Stam,

We need to better understand the structure of your data.

Are you able to take a screenshot of the variable watcher showing a view of your data?

How many times is the repeat looping?

Your query looks like it is trying to get all the record IDs for a given key from a single table. The fastest way to get all the record IDs of a given table is to use:  https://docs.livecloud.io/List/

All this said, I think you have developed a type of sync: you are trying to determine which record IDs are missing in the cloud, and you then cdb_batchCreate all the missing data to the cloud. That is pretty clever. I think we can save you some time.
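That hand-rolled sync pattern could be sketched like this (table and key names are placeholders; tExistingKeys is assumed to already hold the keys present in the cloud, fetched in a single call rather than one query per record):

```
-- sketch of a sync-by-hand, assuming tExistingKeys was fetched up front
local x, tKey, tExistingKeys, tInputA, tOutputA, tTableID

put cdb_TableID("tableName") into tTableID
repeat for each element tArray in pArray
   put tArray[key1][key2] into tKey
   -- purely local check: no network traffic inside the loop
   if tKey is not among the lines of tExistingKeys then
      add 1 to x
      put tKey into tInputA[tTableID][x]["keyName1"]
   end if
end repeat
put cdb_batchCreate(tInputA,"cloud") into tOutputA -- one round-trip
```

The key point is that the per-record cdb_query in the original loop costs one network round-trip each, so the loop alone can dominate the total time.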

The fastest way to sync is to use:  https://docs.livecloud.io/Sync/
This function will do all the heavy lifting for you in a single line of code. It also gives you the freedom to handle data collisions and to delete records that are not in the source.

It would look something like:
Code:
get cdb_sync("*", "tableNameHere", "local", false, false)
-- The asterisk indicates that you want to consider all records in both the source and target of the sync operation.
-- The first false prevents unique data in the cloud from being deleted.
-- The second false says not to compare the versions of the records and replace data in the cloud with what is in the local data.

Please feel free to experiment with both of those booleans to get the desired result.
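Extended to the several tables mentioned earlier, that could look like the following (the table names are placeholders):

```
-- sync each table in turn; "tableOne" etc. are hypothetical names
local tTable
repeat for each item tTable in "tableOne,tableTwo,tableThree"
   -- "*" = all records; false, false as described above
   get cdb_sync("*", tTable, "local", false, false)
end repeat
```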

Let me know how this goes for you.
#3
(09-01-2020, 12:59 AM)mark_talluto Wrote: Hi Stam,

We need to understand better the structure of your data.

Thanks Mark,
In the sparse time I have for doing this (it's not my day job), I've been rewriting my app to keep data in a complex array that would make it simpler to upload everything to LiveCloud with a single batchCreate, in the hope that this would make it much quicker (my previous attempt had me doing this in a loop, which is never a recipe for a happy life...).

However, my efforts have stalled due to an apparent showstopper bug. I've submitted an error report by email, but it's mirrored in the bug section of this forum as well...
#4
Hi Stam,

You are doing great! Let me know if the sync works for you.

Regarding your bug report: are you on version 2.7.3? This version fixes the more-than-400-keys-per-table issue you found. You can check the version by signing in and clicking the "Changes" button at the top of the LCM window.

