PowerBuilder: get DataStore syntax?

I have created a dynamic DataStore and I need to destroy/create bitmaps at runtime (an invoice logo that depends on the company selected).
I destroy all bitmaps with Modify and create new ones with Modify as well.
Although Modify returns an empty string (OK), I cannot see any logos created (or destroyed).
The only way I can think of to check this is to somehow get the DataStore syntax (source) before and after the Modify calls.
But the manual states that DataStores have no Syntax.
Is there any way to get the DataStore source/objects?
Thank you

You should be able to use Describe to get the syntax:
lds.Describe("DataWindow.Syntax")
Here's a quick test:
datastore lds
lds = create datastore
lds.dataobject = 'd_emp'
clipboard(lds.Describe("DataWindow.Syntax"))
destroy lds

Related

App Engine datastore + memcache + get a page of data into cache

I am looking for a way to get a page of data from the datastore into memcache. Basically a comment system like Facebook's, where you load a set of 10 comments at a time. The datastore persists each comment as an object.
I would load 10 comments into an array object and put that into the cache, with a page-ID suffix on the key by convention.
Now the problem: the datastore doesn't promise an increment of 1 when creating auto IDs. I checked this on SO - Autoincrement ID in App Engine datastore.
Upon eviction, how can I load these 10 comments from the datastore into the cache for a particular range, let's say page #5 or #6, when I can't access datastore objects by an incremented key?
Any suggestions are welcome; even if you feel the whole approach is flawed, let me know.
I did explore Google Cloud SQL as an alternative to the datastore, which would take care of my paging and ID-increment problems, but felt it's not the best option as I expect this comments table to eventually grow into a very large dataset.
Thanks!
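One way to handle this without relying on sequential IDs is to page by a query (for example, ordered by creation time) and cache each page under a page-number key. A minimal sketch of that idea follows; the Comment model, key scheme, and helper name are made up for illustration, not taken from the thread:

import logging
from google.appengine.api import memcache
from google.appengine.ext import db

class Comment(db.Model):                     # hypothetical model, for illustration only
    post_id = db.StringProperty()
    body = db.TextProperty()
    created = db.DateTimeProperty(auto_now_add=True)

PAGE_SIZE = 10

def get_comment_page(post_id, page_no):
    """Return page `page_no` (1-based) of comments, re-caching it after an eviction."""
    cache_key = 'comments:%s:page:%d' % (post_id, page_no)
    page = memcache.get(cache_key)
    if page is None:
        # Order by creation time instead of relying on auto IDs being sequential.
        query = Comment.all().filter('post_id =', post_id).order('-created')
        page = query.fetch(PAGE_SIZE, offset=(page_no - 1) * PAGE_SIZE)
        memcache.set(cache_key, page)        # repopulate the cache after eviction
    return page

Offsets get more expensive for deep pages; for strictly sequential paging, storing the query cursor (query.cursor() / query.with_cursor()) alongside each cached page avoids re-scanning earlier results.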

How to do UPDATE and DELETE on GAE db using python?

I have a beginner question.
I was reading the GAE reference and I only saw the "SELECT" operation. I need the counterparts of SQL's UPDATE and DELETE, but all I could figure out is this:
DELETE: .filter from the table to get to an item, then call .remove() on it.
UPDATE: .filter from the table to get to an item, then modify it and insert it back with .put(), but call .remove() on the original element.
Is this the proper and efficient way of doing these operations?
Update is essentially the same as put when the entity already exists; GAE will rewrite the entity when you call .put().
Example (update):
mymodel = MyModel.get_by_key_name('myKey')
mymodel.x = 123
mymodel.put()
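For the delete side of the question: there is no .remove() in the datastore API; you call delete() on the entity, or delete by key. A minimal sketch, reusing the MyModel example above:

from google.appengine.ext import db

mymodel = MyModel.get_by_key_name('myKey')
if mymodel:
    mymodel.delete()                 # removes the entity from the datastore

# Or delete by key without fetching the entity first:
db.delete(db.Key.from_path('MyModel', 'myKey'))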
No
GQL is not SQL.
You need to manipulate the entities in the datastore via models and keys.
First, consider setting aside your preconceptions about how the datastore should work based on your SQL knowledge; once you have, work through the getting-started guide: https://developers.google.com/appengine/docs/python/gettingstartedpython27/usingdatastore
Then progress to the Datastore docs: https://developers.google.com/appengine/docs/python/datastore/
If you have no preexisting App Engine code base, use ndb rather than db.
If you need true SQL capabilities, look at Google Cloud SQL rather than the App Engine datastore.
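If you do start with ndb, update and delete look roughly like this (a sketch with a made-up model, not the asker's code):

from google.appengine.ext import ndb

class Author(ndb.Model):             # hypothetical model for illustration
    city = ndb.StringProperty()

key = ndb.Key(Author, 'some-id')

# Update: get, modify, put back under the same key.
author = key.get()
if author:
    author.city = 'Olympic City'
    author.put()

# Delete: operate on the key directly; no need to fetch the entity first.
key.delete()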

Trying to increase the performance of my GAE app

I am trying to use memcache to improve the performance.
Here is my model:
class ABC(db.Model):
    some_property = db.StringProperty()
    # more properties

class XYZ(db.Model):
    another_property = db.StringProperty()
    abc = db.ReferenceProperty(ABC, collection_name="xyzs")
    # more properties
I have only two ABC entities and 800 XYZ entities.
So, one of the features of the app is to provide an Excel sheet of all XYZs. The Excel sheet has two columns:
the first column is "another_property" and the second column is "some_property" (from the ABC reference).
xyzs = XYZ.all()
for xyz in xyzs:
    logging.info(xyz.another_property)
    logging.info(xyz.abc.some_property)
With this approach, xyz.abc.some_property was making a datastore call every time.
Seeing this, I decided to use memcache to store the ABC reference in memory.
With memcache in use, I didn't see any major change in response time.
abcId = XYZ.abc.get_value_for_datastore(xyz).id()
# Get the ABC reference from memcache if present; otherwise fetch it from the datastore and add it to memcache.
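The per-entity lookup presumably looked something like this (a sketch of the logic described above, not the asker's exact code):

from google.appengine.api import memcache

abc_id = XYZ.abc.get_value_for_datastore(xyz).id()
abc = memcache.get(str(abc_id))
if abc is None:
    abc = ABC.get_by_id(abc_id)      # one datastore round trip per cache miss
    memcache.set(str(abc_id), abc)

Even on a cache hit this is still one memcache RPC per XYZ entity, which is why the per-entity approach shows little gain over 800 iterations.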
Can someone please explain why I am not seeing any performance gain?
I was doing a memcache get for each XYZ entity.
I solved it by doing a bulk memcache get: https://developers.google.com/appengine/docs/python/memcache/clientclass#Client_get_multi
The pseudo code:
abcIds = [str(XYZ.abc.get_value_for_datastore(xyz).id()) for xyz in xyzs]
abcs = memcache.get_multi(abcIds)  # gives a dictionary with the id as key and the ABC entity as value
Here is a snapshot of the improved version of my app.
Posting it as it might help others.
PS: I have a feeling this can also be improved. Help me improve the answer.
As you will probably note from other answers, if you are starting out, consider using ndb.
Now to my answer: you should use a reference-property prefetching approach, which is a lot more efficient; see the write-up by Nick Johnson: http://blog.notdot.net/2010/01/ReferenceProperty-prefetching-in-App-Engine
It basically collects all the keys held in the reference property, then does a single batch get of all the referenced entities. You may find it performs as well as memcache, and if entities are evicted from memcache (which will happen), you still get all of your data.
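A sketch of that approach, close to the helper in the linked post (check the original write-up for the exact version):

import logging
from google.appengine.ext import db

def prefetch_refprops(entities, *props):
    """Resolve the given ReferenceProperty fields for all entities with a single batch db.get()."""
    fields = [(entity, prop) for entity in entities for prop in props]
    ref_keys = [prop.get_value_for_datastore(entity) for entity, prop in fields]
    ref_entities = dict((e.key(), e) for e in db.get(set(ref_keys)))
    for (entity, prop), ref_key in zip(fields, ref_keys):
        prop.__set__(entity, ref_entities[ref_key])
    return entities

# Usage with the models above: one batch get instead of one get per XYZ.
xyzs = prefetch_refprops(XYZ.all().fetch(1000), XYZ.abc)
for xyz in xyzs:
    logging.info(xyz.abc.some_property)    # no extra datastore call here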

Limit fields in Data Viewer

I am trying to do some simple reporting in the Datastore Viewer in GAE. Using GQL, I want to show just a few fields of a record. Is this possible?
How do I take an entity with fields:
f1 f2 f3 f4 f5 f6
and show
f1 f3 f5 f6
This is not possible. From the GQL Reference documentation:
Every GQL query always begins with either SELECT * or SELECT __key__.
And from the Differences with SQL section of the datastore overview:
When querying the datastore, it is not currently possible to return only a subset of kind properties. The App Engine datastore can either return entire entities or only entity keys from a query.
As for why this kind of limitation exists, the article How Entities and Indexes are Stored gives good insight into the technical side of Google's Bigtable, the distributed database system powering App Engine's datastore (and other Google products).
From the article, datastore entities are stored in several different Bigtables. An entity Bigtable stores all of an entity's properties, and several index Bigtables store the entity key sorted according to the entity's indexes.
When we perform a query, there are basically two steps. The first step is that the query is executed against the index Bigtables, producing the set of entity keys that match the query. The second step is that this set of keys is used to fetch the whole entities from the entity Bigtable.
Therefore, when you execute a query starting with SELECT __key__, the datastore only needs to do the first step and immediately returns the set of keys. When you execute a query starting with SELECT *, the datastore does both steps and returns the set of entities.
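In code, the difference between the two forms looks like this (a sketch against a hypothetical kind named MyKind, using the field names from the question):

from google.appengine.ext import db

# Keys only: the datastore stops after the index scan (step one).
keys = db.GqlQuery("SELECT __key__ FROM MyKind WHERE f1 = :1", 'some value').fetch(100)

# Full entities: index scan plus fetching each whole entity (both steps).
entities = db.GqlQuery("SELECT * FROM MyKind WHERE f1 = :1", 'some value').fetch(100)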
Now, regarding why queries like SELECT f1, f3, f5, f6 are not supported by the datastore, we need to look in more detail at what happens during the second step described above. The article states that in the entity Bigtable:
Instead of storing each entity property as an individual column in the corresponding Bigtable row, a single column is used which contains a binary-encoded protocol buffer containing the names and values for every property of a given entity.
Since the low-level protocol buffer stores all of an entity's properties as a single serialized blob, querying only a subset of the entity's properties would require an extra post-processing step of filtering the result set down to the requested properties. This would entail a performance cost in the datastore, and is probably why it is not supported by Google at the moment.

Mass updates in Google App Engine Datastore

What is the proper way to perform mass updates on entities in a Google App Engine Datastore? Can it be done without having to retrieve the entities?
For example, what would be the GAE equivalent of something like this in SQL:
UPDATE dbo.authors
SET city = replace(city, 'Salt', 'Olympic')
WHERE city LIKE 'Salt%';
There isn't a direct translation. The datastore really has no concept of updates; all you can do is overwrite old entities with a new entity at the same address (key). To change an entity, you must fetch it from the datastore, modify it locally, and then save it back.
There's also no equivalent to the LIKE operator. While wildcard suffix matching is possible with some tricks, if you wanted to match '%Salt%' you'd have to read every single entity into memory and do the string comparison locally.
So it's not going to be quite as clean or efficient as SQL. This is a tradeoff with most distributed object stores, and the datastore is no exception.
That said, the mapper library is available to facilitate such batch updates. Follow the example and use something like this for your process function:
from mapreduce import operation as op

def process(entity):
    if entity.city.startswith('Salt'):
        entity.city = entity.city.replace('Salt', 'Olympic')
        yield op.db.Put(entity)
There are other alternatives besides the mapper. The most important optimization tip is to batch your updates; don't save back each updated entity individually. If you use the mapper and yield puts, this is handled automatically.
No, it can't be done without retrieving the entities.
There's no such thing as a '1000 max record limit', but there is of course a timeout on any single request - and if you have large numbers of entities to modify, a simple iteration will probably fall foul of that. You could manage this by splitting the work into multiple operations and keeping track with a query cursor, or potentially by using the MapReduce framework.
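A sketch of that cursor-plus-batching pattern with the db API, assuming an Author model mirroring the authors table (the task-queue re-queueing that would drive it is omitted):

from google.appengine.ext import db

BATCH_SIZE = 100

def update_batch(cursor=None):
    # '>= Salt' and '< Salu' together approximate LIKE 'Salt%' (prefix match).
    query = Author.all().filter('city >=', 'Salt').filter('city <', 'Salu')
    if cursor:
        query.with_cursor(cursor)            # resume where the previous batch stopped
    entities = query.fetch(BATCH_SIZE)
    for entity in entities:
        entity.city = entity.city.replace('Salt', 'Olympic')
    db.put(entities)                         # one batched write, not one put per entity
    # Return a cursor if there may be more work, e.g. to re-queue a task with it.
    return query.cursor() if len(entities) == BATCH_SIZE else None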
You could use the Query class: http://code.google.com/appengine/docs/python/datastore/queryclass.html
query = authors.all().filter('city >=', 'Salt').filter('city <', 'Salu')  # prefix match for 'Salt%'
for record in query:
    record.city = record.city.replace('Salt', 'Olympic')
    record.put()
