Re: well sort of - database design
This WebDNA talk-list message is from 2003
It keeps the original formatting.
numero = 50332
texte = >On 5/12/03 9:53 PM, Andrew Simpson
>wrote:>>> Well the thing here is that if your appending to the databases on each user>> action, i don't think it matters soo much how big the tables get... it just>> sticks the row at the end of the table and goes on its marry way.>>>> but thats about when you flush the databases, take a copy of them and put>> them on your development server to run the queries... that way not nailing>> your production server with the big loopey stuff.>>> > this worked fine for me with database files upwards of 60mb... >just need the> > ram baby! appends didn't suffer at all with the biggie sized db's>Good point.>>>> its just when it comes time to running your queries that you would have>> issues.>But here lies the rub... Part of what is being modeled it the dynamic>content being effected by patterns of exposure. This leads into particular>advertisements based on viewing patterns, as well as intelligent>recommendations for forum discussions and product features based again on>those patterns (queries). So while I could easily move and use when it comes>to data mining (good point by the way), the on the fly smart site' concept>would definitely be hurt by information that wasn't optimized as much as itcould be reducing search and result times.By the way, you wouldn't just need the ram baby if you use appendfile instead of append on the live server. Webdna never needs to open it as a db file if the processing is going to happen on a different computer.How often does this data really need to be evaluated in order to change the parameters on the live site? Minute by minute? Hourly? Only once a day?If 'once a day' is enough, then you could still do the data processing on a separate computer yet retain the dynamic nature. And if your 'separate computer' is online, you can still automate it to retrieve the data from the live server, process it, and upload the results to the live server. 
Then you wouldn't have to do it manually and you would still have a 'smart site'.:)-- Sincerely,Kenneth Grome-------------------------------------------------------------My programmers will write WebDNA code for you at $27 an hour!--------------------------------------------------------------------------------------------------------------------------This message is sent to you because you are subscribed to the mailing list .To unsubscribe, E-mail to: To switch to the DIGEST mode, E-mail to Web Archive of this list is at: http://webdna.smithmicro.com/
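[Editor's note: the [appendfile]-instead-of-[append] idea above might be sketched roughly as follows. The file name, database name, and field names here are hypothetical illustrations, not from the original message; [appendfile], [append], and the [date]/[time]/[ipaddress] tags are standard WebDNA contexts.]

```
[!] Hypothetical sketch: log each page view on the live server. [/!]
[!] [appendfile] writes straight to a flat text file on disk, so [/!]
[!] WebDNA never has to load it into RAM as a database --        [/!]
[!] the heavy processing can happen later on another machine.    [/!]
[appendfile logs/views.txt][date]	[time]	[ipaddress]
[/appendfile]

[!] The in-RAM alternative being discussed: [append db=...]     [/!]
[!] forces WebDNA to open views.db as a database and keep it    [/!]
[!] cached, which is what eats memory as the file grows.        [/!]
[append db=logs/views.db]date=[date]&time=[time]&ip=[ipaddress][/append]
```

A separate machine could then fetch logs/views.txt on whatever schedule fits (hourly, nightly), run the big queries there, and upload only the small result files back to the live server, as suggested above.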
Associated Messages, from the most recent to the oldest:
Kenneth Grome
Related Readings:
BinaryBody problem ...... (2003)
WebCommerce: Folder organization ? (1997)
PSC recommends what date format yr 2000??? (1997)
[TCPSend] and whois? (1999)
Re:2nd WebCatalog2 Feature Request (1996)
Mozilla/4. and Browser Info.txt (1997)
Locking up with WebCatalog... (1997)
any limitations on # of users.db entries (1999)
Re:Frames and cart values (1998)
WebCat2.0 [format thousands .0f] no go (1997)
Searching multiple fields from one form field (1997)
[showif [getcookie otherDomain]=yes] inside a [TCP connect] will work? (2000)
Date search - yes or no (1997)
filemaker - orderfile (1997)
Merging databases (1997)
[shownext] help (2002)
javascript popup (2005)
emailer w/F2 (1997)
accountnum using [listwords] (2001)
[WebDNA] Barcode (2014)