Re: [WebDNA] END processing

This WebDNA talk-list message is from 2014. It keeps the original formatting.

numero = 111310
interpreted = N
texte =

Just out of curiosity, have you thought about how this will affect the Googlebot spider? This sounds like the classic definition of cloaking to me. You might want to add an extra test in your [include] that looks up the user agent and allows msnbot, bingbot, yahoobot, and googlebot to access all the content regardless of your counter. I'm all for zapping competitors taking your info, but don't accidentally zap yourself from Google search in the process.

-Matt

On 4/22/2014 12:17 PM, Brian Burton wrote:
> So here’s a weird situation.
>
> I have a website in the replacement parts business that has extensive cross-reference info on it. It requires a full-time employee to maintain the data. A lot of competitors, trying to save a buck, prefer to copy our data rather than do the research themselves. So I developed code that counts the number of page views and, after a point, cuts off access to the site. As part of an effort to both amuse myself and be “helpful” to competitors that send automated spiders to steal the website, when the cutoff happens I start feeding bogus data as “valid” pages to them. :) This all happens as part of an include file at the top of every page that logs, counts, and issues the redirect to the bogus URL if needed.
>
> So here’s my next challenge: I don’t want to redirect the bad visitors. The redirect is noticeable, and the new URL gives away that it’s fake data. I want to (via an include file) build up a page of fake data on the URL they requested, and END all further processing of the legitimate stuff that would happen below the include. Building a hideif would be troublesome due to the complexity of the code on all the pages this would have to happen on.
>
> Thoughts? Suggestions?
>
> Thanks!
> Brian
> ---------------------------------------------------------
> This message is sent to you because you are subscribed to
> the mailing list .
> To unsubscribe, E-mail to:
> archives: http://mail.webdna.us/list/talk@webdna.us
> Bug Reporting: support@webdna.us

--
Matthew A Perosi
Corporate Consultant
Mobile Marketing Expert
Senior Web Developer
SEO Analyst & Educator
matt@psiprime.com

Psi Prime, Inc.
323 Union Blvd.
Totowa, NJ 07512
Direct: 888.872.0274
Fax: 888.488.5924
http://www.perosi.com
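[Editor's note: a minimal sketch of the user-agent test Matt describes, as it might sit near the top of the [include]. The [browsername] tag, [text] variables, [showif]/[hideif], and the ^ ("contains") comparison are standard WebDNA; the exact spider strings and the [isSpider] flag name are illustrative assumptions, not Brian's actual code.]

[!] Hypothetical allowlist: let known search-engine spiders bypass the
    page-view counter. [browsername] returns the visitor's User-Agent
    string, and ^ means "contains" in a WebDNA comparison. [!]
[text]isSpider=F[/text]
[showif [browsername]^Googlebot][text]isSpider=T[/text][/showif]
[showif [browsername]^bingbot][text]isSpider=T[/text][/showif]
[showif [browsername]^msnbot][text]isSpider=T[/text][/showif]
[showif [browsername]^Yahoo][text]isSpider=T[/text][/showif]

[hideif [isSpider]=T]
[!] existing logging / counting / cutoff logic runs only for
    everyone else [!]
[/hideif]

A production version would probably also verify the spiders some other way (for example by reverse DNS), since a scraper can fake its User-Agent string as easily as anything else.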
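[Editor's note: since WebDNA has no obvious "stop processing here" tag (hence the question), the workaround Brian is trying to avoid is worth sketching for comparison: the guard include sets a flag, and each template wraps its real body in a single [hideif] pair. Both guard.inc and the [banned] variable are made-up names for illustration.]

[!] Top of every page: guard.inc logs, counts, and, when the visitor is
    over the limit, prints the fake data and sets [banned]=T; otherwise
    it sets [banned]=F so the comparison below always resolves. [!]
[include guard.inc]

[hideif [banned]=T]
[!] the page's legitimate WebDNA, unchanged [!]
...
[/hideif]

The point of the single outer [hideif] is that none of the complex interior code has to change; the cost is one opening tag below the include and one closing tag at the bottom of every template, which is presumably the "troublesome" part Brian means.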

    
Associated Messages, from the most recent to the oldest:

  1. Re: [WebDNA] END processing (Brian Burton, 2014)
  2. Re: [WebDNA] END processing (Psi Prime Inc, Matthew A Perosi, 2014)
  3. Re: [WebDNA] END processing (Brian Burton, 2014)
  4. Re: [WebDNA] END processing (Tom Duke, 2014)
  5. [WebDNA] END processing (Brian Burton, 2014)



Related Readings:

[include ...] behavior (1997)
find with exceptions (1997)
read and write you own cookies with webcat (1997)
Limitations on fields? Server is crashing (1997)
PSC recommends what date format yr 2000??? (1997)
How can I Add several Items into the cart at once? (1997)
Speeding up my [showif] performance (1999)
Hierarchy of form/text/math variables (2000)
[WebDNA] multi [sendmail] inside [search] (2012)
[addlineitems] display (1997)
[WebDNA] Secure Cookies (2020)
Card clearance, problems - solutions? (1997)
supressing math results (1997)
Middle Context (2002)
[WebDNA] Webcat 6 - MacIntel - iTools (2008)
OFF-TOPIC: Check www.godaddy.com for me ... (2003)
Emailer Problem (1999)
Fun with Dates - finally resolved but.... (1997)
Exclamation point (1997)
[Announce]: Web server security and password protection (1997)