I did a similar thing a while ago on 2 high-traffic sites.

Basically the concept was this: every page had 3-level-deep menu pulldowns. Those menus were driven by a backend CMS and stored across various tables. The site was serving well over 1 million page views a month, and I was starting to feel the weight of it in performance. Keep in mind that each pulldown was a recursive search for each level, so it really added up.

Instead, I retooled the CMS so that adding, deleting, or editing any menu in the admin resulted in the menu being written completely to a single include file. This put the work of the recursive searching on the backend, and only when a change was needed. It made an immediate difference in page performance.

Later, this was done on another site with similar traffic that was driven entirely by SQL connections. Because of the ODBC performance hit, the speedup was even more dramatic, especially when the tables had 1.5 million records in them.

If you're unsure of the impact, throw an [elapsedtime] tag on the page and you will instantly know just how much performance you squeezed out of the system.

HTH
Alex

> On Jul 3, 2018, at 2:24 PM, Lawrence Banahan wrote:
>
> I was thinking more of something like a CMS, with the engine online.
> Doesn't it make sense to have content that changes once a year in static pages?
> Wouldn't it be faster than having WebDNA in the middle?
> I'm also working on WordPress websites, and they're so slow... That's how I came across some websites using static pages during my searches.
---------------------------------------------------------
This message is sent to you because you are subscribed to the mailing list talk@webdna.us
To unsubscribe, E-mail to: talk-leave@webdna.us
archives: http://www.webdna.us/page.dna?numero=55
Bug Reporting: support@webdna.us
Associated Messages, from the most recent to the oldest:
Alex McCombie
Related Readings:
Norton Internet Security filtering out WebDNA processsed (2004)
[WebDNA] sudo and shell (2010)
Extended [ConvertChars] (1997)
RE:It just Does't add up!!! (1997)
problems with 2 tags (1997)
RE: WebDNA-Talk searchable? (1997)
creator code (1997)
One more try (1997)
WebCat & Liststar ... & Rumpus, too! (1997)
2.0Beta Command Ref (can't find this instruction) (1997)
maximu values for sendmail! (1997)
access denied problem (1997)
Multiple SSL Keys (1998)
Re[2]: Enhancement Request for WebCatalog-NT (1996)
carriage returns in data (1997)
Progress !! WAS: Trouble with formula.db (1997)
matching shipto and others (1998)
[WebDNA] Can webDNA be forced to be able to read ANY file in ANY folder in a webroot in CentOS? (2018)
Separate SSL Server (1997)
Search context not finding recent entries (1998)