
large datasets


Dennis Jessen

Friday 09 March 2007 6:31:32 am

Hello

I have the opportunity to use eZ Publish to develop a webshop; the experience I have with eZ so far is from rather 'small content' sites.

At some point in time the webshop will contain around 2.5 million objects/products, each with possibly as many as 30 attributes. This is just the products.
On average I would expect about 5 of these attributes to be empty / non-existent.

I haven't had any experience yet with datasets of this size, and wouldn't mind some input from those who have.

Some of the questions I have, among many:

Should I use the content object model to store the products? That would give a very large number of attribute rows: roughly 25 populated attributes × 2.5 million objects ≈ 62.5 million rows in ezcontentobject_attribute, on top of the 2.5 million content objects themselves.

Would I be better off extending eZPersistentObject, and if I do, what would the possible drawbacks be (caching etc.)?
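
To make that question more concrete, below is a rough sketch of what I mean: a product in its own table as an eZPersistentObject subclass, following the definition() pattern I have seen in the 3.x kernel classes. The class, table and field names (MyShopProduct, myshop_product, sku, price) are just made up for illustration.

<?php
// Rough sketch only: a product stored in its own flat table instead of as a
// content object, using the eZPersistentObject pattern from eZ Publish 3.x.
include_once( 'kernel/classes/ezpersistentobject.php' );

class MyShopProduct extends eZPersistentObject
{
    function MyShopProduct( $row )
    {
        $this->eZPersistentObject( $row );
    }

    // Maps the class to a custom table with one row per product.
    function definition()
    {
        return array( 'fields' => array(
                          'id'    => array( 'name' => 'ID',
                                            'datatype' => 'integer',
                                            'default' => 0,
                                            'required' => true ),
                          'sku'   => array( 'name' => 'SKU',
                                            'datatype' => 'string',
                                            'default' => '',
                                            'required' => true ),
                          'name'  => array( 'name' => 'Name',
                                            'datatype' => 'string',
                                            'default' => '',
                                            'required' => true ),
                          'price' => array( 'name' => 'Price',
                                            'datatype' => 'float',
                                            'default' => 0.0,
                                            'required' => false ) ),
                      'keys' => array( 'id' ),
                      'increment_key' => 'id',
                      'class_name' => 'MyShopProduct',
                      'name' => 'myshop_product' );
    }

    // Example fetch on an indexed column.
    function fetchBySKU( $sku, $asObject = true )
    {
        return eZPersistentObject::fetchObject( MyShopProduct::definition(),
                                                null,
                                                array( 'sku' => $sku ),
                                                $asObject );
    }
}
?>

As far as I can tell, the point would be one row per product in a flat, indexed table instead of roughly 25 attribute rows per product, at the price of losing versioning, the standard datatypes and the normal fetch functions/templates.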

To start with we would have a dedicated MySQL server and one dedicated webserver, and possibly we would use the DB-based cluster handler (ezDBcachehandler / eZDBFileHandler, I believe) to prepare for adding more webservers later.
Would this be overkill, and are there any issues with things like image fetching?
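
For reference, this is roughly the clustering setup I have in mind, in settings/override/file.ini.append.php. I am writing the setting names from memory, so they should be checked against the file.ini that ships with whichever eZ version we end up on:

[ClusteringSettings]
FileHandler=eZDBFileHandler
DBBackend=eZDBFileHandlerMysqlBackend
DBHost=dbcluster.example.com
DBName=ezcluster
DBUser=ezp
DBPassword=secret

The reason I mention image fetching is that, as far as I understand, with this handler images and other binary files live in the cluster database rather than on the local filesystem, so serving them goes through eZ instead of straight through the webserver.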

Obviously all this is something I will just have to test to gain my own experience, but nonetheless any input would be much appreciated.

Greetings,

Dennis