What I think I'd like to do is tie into the page-not-found system,
i.e. have the server send my 404 requests to URLs.tpl instead of error.html.
That way all pages act as they do currently, BUT
any "pretty" URL gets a "not found" and is rerouted to URLs.tpl,
and inside that file I want to do something like this (yes, this is 100% wrong, just typing out loud here):
[showif [url][thispage][/url]=[grep]('notebook_battery/$')[/grep]]
  [include file=alphamfg.tpl&_CID=2]
[/showif]
[showif [url][thispage][/url]=[grep]('notebook_battery/(?P<MFG>\w+)/$')[/grep]]
  [include file=pickmodel&_CID=2&_MFG=[MFG]]
[/showif]
[showif [url][thispage][/url]=[grep]('notebook_battery/(?P<MFG>[^/]+)/(?P<FID>\d+).*')[/grep]]
  [include file=modelinfo&_CID=2&_MFG=[MFG]&_FID=[FID]]
[/showif]
(and have a last rule that actually redirects to a 404 page...)
I want to figure out how to use an include so that I'm specifically not rewriting and redirecting the URL (thus making it ugly).
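The three showif rules sketched above boil down to a small pattern-to-template dispatch table: try each regex in order, and on the first match include the named template with the captured groups passed along as parameters. As a rough illustration of that logic (not WebDNA; a hypothetical Python analogue, where the ROUTES table and route() helper are my own made-up names mirroring the templates and parameters in the sketch):

```python
import re

# Each rule: (URL pattern, template to include, fixed parameters).
# Named groups in the pattern (MFG, FID) become extra parameters.
ROUTES = [
    (r'notebook_battery/$', 'alphamfg.tpl', {'_CID': '2'}),
    (r'notebook_battery/(?P<MFG>\w+)/$', 'pickmodel', {'_CID': '2'}),
    (r'notebook_battery/(?P<MFG>[^/]+)/(?P<FID>\d+).*', 'modelinfo', {'_CID': '2'}),
]

def route(url):
    """Return (template, params) for the first matching rule,
    or (None, None) to fall through to the real 404 page."""
    for pattern, template, base_params in ROUTES:
        m = re.search(pattern, url)
        if m:
            params = dict(base_params)
            # Captured groups become _MFG, _FID, etc.
            for name, value in m.groupdict().items():
                params['_' + name] = value
            return template, params
    return None, None
```

The key property, matching the goal above, is that the browser's URL never changes: the dispatcher only decides which template to include server-side, and a non-match falls through to a genuine 404.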
Anyone currently doing anything like this?
If I can figure out how to do it, is anyone else interested in the code?
Brian B. Burton
brian@burtons.com
=========================================
  time is precious. waste it wisely
=========================================
On Apr 27, 2011, at 6:22 AM, William DeVaul wrote:

On Tue, Apr 26, 2011 at 11:38 PM, Kenneth Grome <kengrome@gmail.com> wrote:
>> I'm not sure why you'd leave the humans ugly URLs.
> Because those URLs are the default URLs for WebDNA.

I thought the parameterized URLs were a convention that came about in the early days of the Internet. Seems the convention is ripe for change.

In general, I'm for programmer convenience versus optimization for the computer. But I'd put user convenience above the programmer's. In some frameworks the default is "prettier" to the benefit of users and programmers.

>> The search engines like keywords.
> They get plenty of keywords in the static pages.

I think it is about quality of the keyword placement (in incoming links, in the domain, in the URLs, in "important" tags, e.g. <h1>).
---------------------------------------------------------
This message is sent to you because you are subscribed to
the mailing list <talk@webdna.us>.
To unsubscribe, E-mail to: <talk-leave@webdna.us>
archives: http://mail.webdna.us/list/talk@webdna.us
Bug Reporting: support@webdna.us