Re: [WebDNA] Issues regarding static vs. dynamic pages

This WebDNA talk-list message is from 2011 (archive message #106151).
It keeps the original formatting.
texte = This is a multi-part message in MIME format. ------=_NextPart_000_0007_01CBBB92.B161B120 Content-Type: text/plain; charset="iso-8859-1" Content-Transfer-Encoding: quoted-printable >Or better yet, I would prefer to just write the results to a complete = set of static web pages for the entire website. This is exactly what I ended up doing, except that I made the results = static includes and now include them in the site as opposed to having so = many staic actual pages. This way, the purpose of the CMS isn't lost. In = doing this, I also figured out how to make it so the friendly URLs can = work properly and will be testing and implementing them today. -Dan From: Kenneth Grome=20 Sent: Monday, January 24, 2011 5:33 AM To: talk@webdna.us=20 Subject: [WebDNA] Issues regarding static vs. dynamic pages > >> every single word from each text of > >> each page is compared to a database in order to > >> create dynamic links: all the links from the websites > >> are in fact created "on the fly". If I had coded this type of feature I would not have done it this way -- = precisely because it is slow and unnecessarily demanding of resources = with every hit to the website.=20 Instead I would create a script that processes "every single word from = each text of each page" only once a day, and I would have this script = write the resulting text with links to a new database.=20 Or better yet, I would prefer to just write the results to a complete = set of static web pages for the entire website. Then the server can = deliver these static pages without requiring such a heavy load on WebDNA = at the time each page is requested. The end result is that the site could be made up of mostly static pages = not served by WebDNA. Yet those static pages would still be dynamically = generated by WebDNA once a day to make sure they are current and = up-to-date every 24 hours. 
Naturally the site admin should be able to run this script manually so = he can update these pages "on demand" whenever he makes an important = update that needs to be displayed to visitors immediately.=20 I've done this kind of thing for several client sites before so I know = it works -- and it works well. It makes the website very fast because = most pages can be cached and gzipped by the server -- rather than being = rendered by WebDNA upon each request -- thus avoiding unnecessarily = heavy demands on WebDNA when those pages are requested. And by using a trigger to run my script in the middle of the night when = almost no one is visiting, the heavy WebDNA processing is performed at = the most opportune time in terms of server resources, so it never slows = down the experience of daytime visitors unnecessarily. Oh, you say that with your site consisting of mostly static pages you = cannot use WebDNA to let your visitors dynamically search your site? = Don't worry about it because they couldn't care less what software = powers the search. So instead of using WebDNA for site searches, just = give your visitors a Google search box, and let them do a custom Google = search within your mostly static site ... Not only are most people already familiar with the way Google search = works (so they like this option) but this little trick also insures that = Google *WILL* index all your static web pages -- on a regular basis -- = probably once a day if your page headers tell Google that those pages = expire daily. Sincerely, Kenneth Grome --------------------------------------------------------- This message = is sent to you because you are subscribed to the mailing list . To = unsubscribe, E-mail to: archives: = http://mail.webdna.us/list/talk@webdna.us Bug Reporting: = support@webdna.us ------=_NextPart_000_0007_01CBBB92.B161B120 Content-Type: text/html; charset="iso-8859-1" Content-Transfer-Encoding: quoted-printable
> Or better yet, I would prefer to just write the results to a complete set of static web pages for the entire website.

 

This is exactly what I ended up doing, except that I made the results static includes and now include them in the site, as opposed to having so many static actual pages. This way, the purpose of the CMS isn't lost. In doing this, I also figured out how to make the friendly URLs work properly, and I will be testing and implementing them today.

 

-Dan


From: Kenneth Grome
Sent: Monday, January 24, 2011 5:33 AM
To: talk@webdna.us
Subject: [WebDNA] Issues regarding static vs. dynamic pages

> >> every single word from each text of
> >> each page is compared to a database in order to
> >> create dynamic links: all the links from the websites
> >> are in fact created "on the fly".

If I had coded this type of feature I would not have done it this way -- precisely because it is slow and unnecessarily demanding of resources with every hit to the website.

Instead I would create a script that processes "every single word from each text of each page" only once a day, and I would have this script write the resulting text, with links, to a new database.

Or better yet, I would prefer to just write the results to a complete set of static web pages for the entire website. Then the server can deliver these static pages without placing such a heavy load on WebDNA each time a page is requested.

The end result is that the site could be made up of mostly static pages not served by WebDNA. Yet those static pages would still be regenerated by WebDNA once a day, so they are never more than 24 hours out of date.

Naturally, the site admin should be able to run this script manually, so he can update these pages "on demand" whenever he makes an important update that needs to be displayed to visitors immediately.
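The once-a-day script described above would be written in WebDNA on this list, but the shape of it is language-neutral. Here is a minimal stand-in sketch in Python, assuming a hypothetical SQLite database with a `pages` table (`slug`, `title`, `body` columns) and a hypothetical `static/` output directory; the same function serves both the nightly run and the admin's manual "on demand" rebuild:

```python
import sqlite3
from pathlib import Path

def build_static_pages(db_path="site.db", out_dir="static"):
    """Render every page row in the database to a static HTML file.

    Hypothetical schema: pages(slug TEXT, title TEXT, body TEXT).
    Run nightly from a scheduler, or manually for an on-demand rebuild.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    conn = sqlite3.connect(db_path)
    for slug, title, body in conn.execute("SELECT slug, title, body FROM pages"):
        # One static file per page; the web server delivers these directly.
        html = f"<html><head><title>{title}</title></head><body>{body}</body></html>"
        (out / f"{slug}.html").write_text(html, encoding="utf-8")
    conn.close()
```

The database, schema, and template here are illustrative only; the point is that a single batch pass replaces per-request rendering.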

I've done this kind of thing for several client sites before, so I know it works -- and it works well. It makes the website very fast, because most pages can be cached and gzipped by the server -- rather than being rendered by WebDNA upon each request -- thus avoiding unnecessarily heavy demands on WebDNA when those pages are requested.
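The "gzipped by the server" part can even be helped along at generation time. As a sketch (assuming the hypothetical `static/` output directory of a nightly build), each generated page can get a pre-compressed `.gz` sibling, which servers configured for precompressed assets (e.g. nginx's `gzip_static` directive) can send as-is instead of compressing on every hit:

```python
import gzip
from pathlib import Path

def precompress(out_dir="static"):
    """Write a .gz sibling next to every .html file so the web server
    can serve the compressed copy directly instead of gzipping per request."""
    for page in Path(out_dir).glob("*.html"):
        data = page.read_bytes()
        page.with_name(page.name + ".gz").write_bytes(gzip.compress(data))
```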

And by using a trigger to run my script in the middle of the night, when almost no one is visiting, the heavy WebDNA processing is performed at the most opportune time in terms of server resources, so it never slows down the experience of daytime visitors.
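The trigger can be anything that fires once a day at a quiet hour -- WebDNA has its own trigger mechanism, and on a Unix host a cron entry does the same job. As a language-neutral sketch, the delay until a chosen off-peak hour (03:00 here is an assumed example, not anything from the original post) can be computed like this:

```python
import datetime

def seconds_until(hour=3, minute=0, now=None):
    """Seconds from `now` until the next occurrence of hour:minute
    (e.g. 03:00, a low-traffic slot for the nightly rebuild)."""
    now = now or datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # already past today's slot
    return (target - now).total_seconds()
```

A daemon could sleep for this many seconds and then run the rebuild; in practice a crontab line like `0 3 * * * /path/to/rebuild` is the usual way to express the same schedule.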

Oh, you say that with your site consisting of mostly static pages you cannot use WebDNA to let your visitors dynamically search your site? Don't worry about it, because they couldn't care less what software powers the search. So instead of using WebDNA for site searches, just give your visitors a Google search box, and let them do a custom Google search within your mostly static site ...

Not only are most people already familiar with the way Google search works (so they like this option), but this little trick also ensures that Google *WILL* index all your static web pages -- on a regular basis -- probably once a day, if your page headers tell Google that those pages expire daily.
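The "page headers tell Google that those pages expire daily" detail would normally be configured in the web server (Apache's mod_expires module, for instance). As a sketch of what those headers amount to, using only the Python standard library:

```python
import datetime
from email.utils import format_datetime

def daily_cache_headers(now=None):
    """HTTP response headers declaring that a page expires one day from
    now, nudging caches and crawlers to re-fetch roughly daily."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    expires = now + datetime.timedelta(days=1)
    return {
        "Cache-Control": "max-age=86400",  # 24 hours, in seconds
        "Expires": format_datetime(expires, usegmt=True),
    }
```

In a real deployment these values would be emitted by the server configuration rather than application code; the function just makes the header values concrete.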

Sincerely,

Kenneth Grome

---------------------------------------------------------
This message is sent to you because you are subscribed to the mailing list.
To unsubscribe, E-mail to:
Archives: http://mail.webdna.us/list/talk@webdna.us
Bug Reporting: support@webdna.us

Associated Messages, from the most recent to the oldest:

    
  1. Re: [WebDNA] Issues regarding static vs. dynamic pages ("Dan Strong" 2011)
  2. Re: [WebDNA] Issues regarding static vs. dynamic pages (William DeVaul 2011)
  3. Re: [WebDNA] Issues regarding static vs. dynamic pages ("Dan Strong" 2011)
  4. [WebDNA] Issues regarding static vs. dynamic pages (Kenneth Grome 2011)



Related Readings:

Pull Down Search (2000)
Lock out multiple users with same password? (1999)
WebCat2b13MacPlugIn - More limits on [include] (1997)
OT: JavaScript question (2001)
Problem when Inputing text (1999)
RE: [isfolder] and [filename] (1997)
WebCatalog + WebMerchant 2.1 for Windows released (1998)
Checking for WebCatalog Version (2000)
New Mac Emailer... (1997)
WebMerchant and Mac Auth Hub Help Please (1999)
Custom WebCat Prefs ... (1997)
Sorting a [listfiles] (2003)
Satistics (1999)
SKU lookup (1997)
displaying New products (using [date]) (1997)
PIXO with cometsite ... and/or other plugins (1998)
Error: Too many nested [xxx] contexts (WebTen only) (1997)
RE: WebCat virtual postcard is done! Thanks for the help! (1998)
Major Security Hole IIS NT (1998)
many-to-one problem (1998)