In 2008, I founded Build Interactive, a small studio specializing in website design, development, and mobile applications. I've had the privilege of working with many world-class brands, including multiple Fortune 500 companies, and have hand-coded over 200 projects.
I'm noticing some odd-looking user-agent strings in my server logs that differ from the typical Mozilla, Chrome, and other expected browser identifiers. Is there any normal reason for a user agent to be python-requests or python-urllib? Is this type of traffic usually bad scrapers, or can these be "good" crawlers that shouldn't be blocked?
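For context, those strings are simply the default User-Agent headers that Python's HTTP libraries send when the caller doesn't override them. A quick sketch showing where they come from (the exact version numbers will vary with the installed libraries):

```python
import urllib.request
import requests.utils

# The default User-Agent the requests library sends when the caller
# does not set one, e.g. "python-requests/2.28.1".
print(requests.utils.default_user_agent())

# The standard library's opener likewise identifies itself as
# "Python-urllib/<major.minor>" unless a custom header is supplied.
print(urllib.request.build_opener().addheaders)
```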
I'm currently using Visual Studio Code version 1.71.0, and an error I've been seeing for a while with PHP files is: "Cannot validate since a PHP installation could not be found. Use the setting 'php.validate.executablePath' to configure the PHP executable." What is the quick fix to make this go away?
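From the error text, I assume the fix is a settings.json change along these lines, either pointing the validator at a PHP binary or turning validation off; the path below is just an example, not my actual install location:

```jsonc
{
  // Point the validator at a PHP executable (example path; adjust to
  // wherever PHP is actually installed)...
  "php.validate.executablePath": "/usr/local/bin/php",

  // ...or, if PHP isn't installed locally, silence the error by
  // disabling PHP validation entirely.
  "php.validate.enable": false
}
```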