Re: [WebDNA] Issues regarding static vs. dynamic pages

This WebDNA talk-list message is from 2011. It keeps the original formatting.
numero = 106153
interpreted = N
texte = Can't you just write the static file whenever the content or the keyword database is changed? That way changes are immediately visible and each page is generated only once. In theory this is similar to memcached, the kind of setup that sites able to withstand an Oprah mention would have.

On Mon, Jan 24, 2011 at 9:48 AM, Dan Strong wrote:

> >> Or better yet, I would prefer to just write the results to a complete
> >> set of static web pages for the entire website.
>
> This is exactly what I ended up doing, except that I made the results
> static includes and now include them in the site as opposed to having so
> many static actual pages. This way, the purpose of the CMS isn't lost. In
> doing this, I also figured out how to make it so the friendly URLs can
> work properly, and I will be testing and implementing them today.
>
> -Dan
>
> From: Kenneth Grome
> Sent: Monday, January 24, 2011 5:33 AM
> To: talk@webdna.us
> Subject: [WebDNA] Issues regarding static vs. dynamic pages
>
> >> every single word from each text of each page is compared to a
> >> database in order to create dynamic links: all the links from the
> >> websites are in fact created "on the fly".
>
> If I had coded this type of feature I would not have done it this way --
> precisely because it is slow and unnecessarily demanding of resources
> with every hit to the website.
>
> Instead I would create a script that processes "every single word from
> each text of each page" only once a day, and I would have this script
> write the resulting text with links to a new database.
>
> Or better yet, I would prefer to just write the results to a complete set
> of static web pages for the entire website. Then the server can deliver
> these static pages without putting such a heavy load on WebDNA at the
> time each page is requested.
>
> The end result is that the site could be made up of mostly static pages
> not served by WebDNA.
> Yet those static pages would still be dynamically generated by WebDNA
> once a day, to make sure they are current and up to date every 24 hours.
>
> Naturally the site admin should be able to run this script manually so he
> can update these pages "on demand" whenever he makes an important update
> that needs to be displayed to visitors immediately.
>
> I've done this kind of thing for several client sites before, so I know
> it works -- and it works well. It makes the website very fast, because
> most pages can be cached and gzipped by the server -- rather than being
> rendered by WebDNA upon each request -- thus avoiding unnecessarily heavy
> demands on WebDNA when those pages are requested.
>
> And by using a trigger to run my script in the middle of the night, when
> almost no one is visiting, the heavy WebDNA processing is performed at
> the most opportune time in terms of server resources, so it never slows
> down the experience of daytime visitors.
>
> Oh, you say that with your site consisting of mostly static pages you
> cannot use WebDNA to let your visitors dynamically search your site?
> Don't worry about it, because they couldn't care less what software
> powers the search. So instead of using WebDNA for site searches, just
> give your visitors a Google search box and let them do a custom Google
> search within your mostly static site ...
>
> Not only are most people already familiar with the way Google search
> works (so they like this option), but this little trick also ensures that
> Google *WILL* index all your static web pages -- on a regular basis --
> probably once a day if your page headers tell Google that those pages
> expire daily.
>
> Sincerely,
>
> Kenneth Grome

---------------------------------------------------------
This message is sent to you because you are subscribed to the mailing list.
To unsubscribe, E-mail to:
archives: http://mail.webdna.us/list/talk@webdna.us
Bug Reporting: support@webdna.us
---------------------------------------------------------

Associated Messages, from the most recent to the oldest:

    
  1. Re: [WebDNA] Issues regarding static vs. dynamic pages ("Dan Strong" 2011)
  2. Re: [WebDNA] Issues regarding static vs. dynamic pages (William DeVaul 2011)
  3. Re: [WebDNA] Issues regarding static vs. dynamic pages ("Dan Strong" 2011)
  4. [WebDNA] Issues regarding static vs. dynamic pages (Kenneth Grome 2011)


