Re: [WebDNA] Issues regarding static vs. dynamic pages
This WebDNA talk-list message is from 2011. It keeps the original formatting.
Message number: 106151
> Or better yet, I would prefer to just write the results to a complete set of static web pages for the entire website.
This is exactly what I ended up doing, except that I made the results static includes and now include them in the site, as opposed to having so many static actual pages. This way, the purpose of the CMS isn't lost. In doing this, I also figured out how to make the friendly URLs work properly, and I will be testing and implementing them today.
-Dan
From: Kenneth Grome
Sent: Monday, January 24, 2011 5:33 AM
To: talk@webdna.us
Subject: [WebDNA] Issues regarding static vs. dynamic pages
> >> every single word from each text of
> >> each page is compared to a database in order to
> >> create dynamic links: all the links from the websites
> >> are in fact created "on the fly".
If I had coded this type of feature I would not have done it this way -- precisely because it is slow and unnecessarily demanding of resources with every hit to the website.
Instead I would create a script that processes "every single word from each text of each page" only once a day, and I would have this script write the resulting text with links to a new database.
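The idea above -- link every known word once, offline, instead of on every hit -- can be sketched roughly as follows. This is Python standing in for WebDNA, and the keyword table and URLs are hypothetical placeholders, not anything from the original site:

```python
import re

# Hypothetical keyword database: word -> target URL.
# In the original setup this would be a WebDNA .db file; a dict stands in here.
KEYWORDS = {
    "webdna": "/tags/webdna.html",
    "catalog": "/tags/catalog.html",
}

def link_words(text: str) -> str:
    """Replace each known word with an HTML link, leaving other text untouched."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        url = KEYWORDS.get(word.lower())
        return f'<a href="{url}">{word}</a>' if url else word
    return re.sub(r"[A-Za-z]+", repl, text)

# Run once a day over every page body and store the result,
# instead of repeating this work on every request.
print(link_words("WebDNA powers the catalog"))
```

The expensive scan happens once per day; the stored, already-linked text is what gets served.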
Or better yet, I would prefer to just write the results to a complete set of static web pages for the entire website. Then the server can deliver these static pages without requiring such a heavy load on WebDNA at the time each page is requested.
The end result is that the site could be made up of mostly static pages not served by WebDNA. Yet those static pages would still be dynamically generated by WebDNA once a day to make sure they are current and up-to-date every 24 hours.
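A minimal sketch of this "snapshot the dynamic pages to static files" step, again in Python rather than WebDNA, with placeholder URLs and paths -- none of these names come from the original site:

```python
import pathlib
import urllib.request

# Directory the web server serves directly (hypothetical path).
OUTPUT_DIR = pathlib.Path("static_site")

def fetch(url: str) -> bytes:
    """Fetch one dynamically rendered page from the app engine."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def regenerate(pages: dict, fetcher=fetch) -> None:
    """Snapshot each dynamic page to a static file.

    Run this from a nightly trigger (e.g. a cron job), and again on demand
    whenever the site admin makes an urgent update. The web server can then
    cache and gzip the files without touching the app engine per request.
    """
    OUTPUT_DIR.mkdir(exist_ok=True)
    for filename, url in pages.items():
        (OUTPUT_DIR / filename).write_bytes(fetcher(url))
```

Passing `fetcher` as a parameter keeps the snapshot logic separate from how pages are rendered, so the same script works whether the dynamic engine is WebDNA or anything else.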
Naturally the site admin should be able to run this script manually, so he can update these pages "on demand" whenever he makes an important update that needs to be displayed to visitors immediately.
I've done this kind of thing for several client sites before, so I know it works -- and it works well. It makes the website very fast because most pages can be cached and gzipped by the server -- rather than being rendered by WebDNA upon each request -- thus avoiding unnecessarily heavy demands on WebDNA when those pages are requested.
And by using a trigger to run my script in the middle of the night, when almost no one is visiting, the heavy WebDNA processing is performed at the most opportune time in terms of server resources, so it never slows down the experience of daytime visitors.
Oh, you say that with your site consisting of mostly static pages you cannot use WebDNA to let your visitors dynamically search your site? Don't worry about it, because they couldn't care less what software powers the search. So instead of using WebDNA for site searches, just give your visitors a Google search box and let them do a custom Google search within your mostly static site ...
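One simple way to implement such a search box is to send the visitor's query to Google with the `site:` operator restricting results to your domain. A small sketch (the domain is a placeholder):

```python
from urllib.parse import urlencode

def google_site_search_url(site: str, terms: str) -> str:
    """Build a Google search URL restricted to one site via the `site:` operator.

    A plain HTML form on the static pages can point its query here, so
    visitors search the site without the server doing any work at all.
    """
    return "https://www.google.com/search?" + urlencode({"q": f"site:{site} {terms}"})

print(google_site_search_url("example.com", "static pages"))
```

The server never sees the search traffic; Google's index does all the work.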
Not only are most people already familiar with the way Google search works (so they like this option), but this little trick also ensures that Google *WILL* index all your static web pages -- on a regular basis -- probably once a day, if your page headers tell Google that those pages expire daily.
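"Expire daily" in HTTP terms means serving `Cache-Control` / `Expires` headers with a one-day lifetime alongside the static files. A sketch of generating such headers (the exact server configuration would differ; this just shows the header values):

```python
import time
from email.utils import formatdate

def daily_expiry_headers(now=None):
    """HTTP headers telling caches and crawlers a page is good for one day.

    Serving these with the static pages is consistent with a once-a-day
    regeneration schedule, hinting that crawlers should revisit daily.
    """
    now = time.time() if now is None else now
    return {
        "Cache-Control": "max-age=86400",                  # one day, in seconds
        "Expires": formatdate(now + 86400, usegmt=True),   # RFC 1123 date
    }

print(daily_expiry_headers()["Cache-Control"])
```

In practice these headers would be set in the web server config (Apache, nginx, etc.) rather than in application code.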
Sincerely,
Kenneth Grome
---------------------------------------------------------
This message is sent to you because you are subscribed to the mailing list.
To unsubscribe, E-mail to:
Archives: http://mail.webdna.us/list/talk@webdna.us
Bug Reporting: support@webdna.us
Associated Messages, from the most recent to the oldest:
"Dan Strong"
Related Readings:
Pull Down Search (2000)
Lock out multiple users with same password? (1999)
WebCat2b13MacPlugIn - More limits on [include] (1997)
OT: JavaScript question (2001)
Problem when Inputing text (1999)
RE: [isfolder] and [filename] (1997)
WebCatalog + WebMerchant 2.1 for Windows released (1998)
Checking for WebCatalog Version (2000)
New Mac Emailer... (1997)
WebMerchant and Mac Auth Hub Help Please (1999)
Custom WebCat Prefs ... (1997)
Sorting a [listfiles] (2003)
Satistics (1999)
SKU lookup (1997)
displaying New products (using [date]) (1997)
PIXO with cometsite ... and/or other plugins (1998)
Error: Too many nested [xxx] contexts (WebTen only) (1997)
RE: WebCat virtual postcard is done! Thanks for the help! (1998)
Major Security Hole IIS NT (1998)
many-to-one problem (1998)