An in-house VB.NET ORM has been developed that uses stored procedures to fetch the data used to populate individual objects in a complex hierarchy. We have moved from a stored procedure per mapping action to dynamic SQL that is built and executed at runtime, which should improve both functionality and performance. Each class has its own mapping code stating the property-to-column mapping, table, cardinality, etc.
My issue is that, due to the complexity of the object model, an object being built by the ORM/mapper code can pull in more data than is needed, which hampers performance.
Is this an issue with the object model or the ORM or both?
Could anybody suggest a design pattern/idea to help address this issue?
My first thought was that we should add extra functionality to our mapping classes to control the depth of the data retrieved, but I don't know how that could be achieved cleanly. Any ideas or thoughts on this subject will be gratefully received.
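One way the depth-control idea could be sketched (in Java for brevity rather than VB.NET; `FetchOptions` and the depth budget are hypothetical names, not part of any existing ORM) is to pass a remaining-depth budget down through the mapper, decrementing it each time a child association is materialised:

```java
// Hypothetical sketch: the mapper carries a remaining-depth budget and stops
// materialising child associations once the budget reaches zero.
class FetchOptions {
    final int maxDepth;

    FetchOptions(int maxDepth) { this.maxDepth = maxDepth; }

    // Can the mapper descend into another level of child objects?
    boolean canDescend() { return maxDepth > 0; }

    // Options to hand to the child mapping, one level deeper.
    FetchOptions descend() { return new FetchOptions(maxDepth - 1); }
}
```

Each mapping class would check `canDescend()` before generating the SQL for a child collection and pass `descend()` into the child's mapping call, so callers decide per-query how deep the hierarchy is populated.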
asked Aug 24, 2012 at 10:08
ORMs like Hibernate deal with this scenario using a "lazy fetch" strategy, where object associations to any depth are maintained as-is, but the objects in the hierarchy are fetched on demand when they are first accessed. You might want to look at the Proxy pattern to get an idea of how this could be done.
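A minimal sketch of that lazy-fetch idea (again in Java; `LazyRef`, `Customer`, and `Order` are illustrative names, and the `Supplier` loader stands in for the ORM's dynamic SQL call — this is not Hibernate's actual implementation):

```java
import java.util.List;
import java.util.function.Supplier;

// Proxy-style holder: the real data is not loaded until first access.
class LazyRef<T> {
    private final Supplier<T> loader; // stands in for the ORM's SQL fetch
    private T value;
    private boolean loaded;

    LazyRef(Supplier<T> loader) { this.loader = loader; }

    T get() {
        if (!loaded) {            // first access triggers the fetch
            value = loader.get();
            loaded = true;
        }
        return value;             // subsequent accesses reuse the cached value
    }
}

class Order {
    final String id;
    Order(String id) { this.id = id; }
}

class Customer {
    private final LazyRef<List<Order>> orders;

    Customer(Supplier<List<Order>> orderLoader) {
        this.orders = new LazyRef<>(orderLoader);
    }

    // The database hit happens here, not when the Customer is constructed.
    List<Order> getOrders() { return orders.get(); }
}
```

The object graph keeps its full shape, but building a `Customer` costs only one query; the orders query runs only if `getOrders()` is ever called, which is exactly the behaviour that avoids pulling more data than is needed.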