
Dynamic pages not being indexed

Hello all,

this is an index of sub-pages, with dynamic URLs:

  • http://www.oeko-fakt.de/produkte/details/

An array of real “dofollow” links is rendered to the search engine, but these sub-pages are not being indexed.
Any idea why?

    Thanks in advance and
    Kind regards

@Roopatg (Sep 23, 2014) — Make your website URLs search-engine friendly; a URL should be readable. I also think you need to check your sitemap and set the change frequency to daily rather than yearly. Try submitting a sitemap.xml file.
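For reference, a single sitemap entry with a daily change frequency looks roughly like this (a minimal sketch; the URL is one of the dynamic product pages discussed in this thread, and the PHP simply prints the entry):

[code]
<?php
// Minimal sketch: print one sitemap <url> entry with a daily change frequency.
// The URL is one of the dynamic product pages from this thread.
$url = 'http://www.oeko-fakt.de/produkte/details/?pid=alsecco-alprotect-aero';
$loc = htmlspecialchars( $url, ENT_XML1 ); // XML-escape, e.g. "&" becomes "&amp;"

echo <<<XML
<url>
  <loc>{$loc}</loc>
  <changefreq>daily</changefreq>
</url>
XML;
[/code]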
@arvgta (author, Sep 23, 2014) — Thanks!

The URLs are all in this format:

  • http://www.oeko-fakt.de/produkte/details/?pid=alsecco-alprotect-aero

Is that not search-engine friendly?

I've now also installed a WP plugin, "XML Sitemaps", which was missing, but I don't know whether it's smart enough to recognise the above links...

    EDIT: I could see in the sitemap.xml that the update frequency was "yearly".

    Unfortunately, the XML Sitemaps tool doesn't allow a change of frequency.

I read, though, that it regenerates the file frequently, so I've deleted the sitemap.xml, hoping that it will be rebuilt...

I'm eager to see whether it's smart enough to recognise dynamic pages...

    Thanks very much for your reply!
@Roopatg (Sep 24, 2014) — I don't think it is search-engine friendly; try to have link URLs without "?" or "=", so that they can be read easily by search engines.

There are many tools available to generate a sitemap file, and some let you edit it. I feel it would be good if you could set the frequency to daily.
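One way to get link URLs without "?" or "=" in WordPress (assuming /produkte/details/ is an ordinary WP page whose template reads the pid parameter) is a custom rewrite rule. A rough sketch for the theme's functions.php, untested against the actual site:

[code]
<?php
// Sketch: serve /produkte/details/alsecco-alprotect-aero/ instead of
// /produkte/details/?pid=alsecco-alprotect-aero. Assumes "produkte/details"
// is a regular WordPress page whose template reads the pid query var.
add_action( 'init', function () {
    // Make WordPress aware of the custom query var.
    add_rewrite_tag( '%pid%', '([^&/]+)' );

    // Map the pretty URL onto the existing page plus its pid parameter.
    add_rewrite_rule(
        '^produkte/details/([^/]+)/?$',
        'index.php?pagename=produkte/details&pid=$matches[1]',
        'top'
    );
} );

// In the page template: $pid = get_query_var( 'pid' );
// After adding the rule, flush permalinks once (Settings -> Permalinks -> Save).
[/code]

The existing ?pid= URLs would keep working; the rewrite just adds a cleaner form that the internal links and the sitemap could then use.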
@Masondavis (Sep 24, 2014) — I also think you need to check your sitemap and set the frequency to daily rather than yearly. Try submitting the sitemap.xml file.
@arvgta (author, Sep 24, 2014) — Thanks again for your reply.

A new sitemap.xml has been re-generated, unfortunately with those dynamic URLs missing. So the WP XML Sitemaps tool is not smart enough to generate the dynamic URLs. This would be the first step...

What's the correct WP way of doing this?
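One possible approach (a sketch of the general idea, not necessarily "the" WP way): a few lines of PHP can write a sitemap.xml that lists the dynamic product URLs directly. The slug list below is a placeholder for whatever data source the /produkte/details/ template already uses:

[code]
<?php
// Sketch: build a sitemap.xml that includes the dynamic ?pid= product URLs.
// The slug list is a placeholder; it should come from the same data source
// that the /produkte/details/ page template uses.
$slugs = array( 'alsecco-alprotect-aero', 'renault-zoe' );

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

foreach ( $slugs as $slug ) {
    $loc  = 'http://www.oeko-fakt.de/produkte/details/?pid=' . rawurlencode( $slug );
    $xml .= "  <url>\n";
    $xml .= '    <loc>' . htmlspecialchars( $loc, ENT_XML1 ) . "</loc>\n";
    $xml .= "    <changefreq>daily</changefreq>\n";
    $xml .= "  </url>\n";
}

$xml .= "</urlset>\n";

// ABSPATH is the WordPress root when this runs inside WP (e.g. from a tiny
// plugin or a WP-Cron callback); from a standalone script use the site root path.
file_put_contents( ABSPATH . 'sitemap.xml', $xml );
[/code]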


    Thanks and

    Kind regards
@arvgta (author, Sep 24, 2014) — Hi Masondavis,

    thanks for your contribution!

There's no option in WP XML Sitemaps for setting the update frequency.

However, when the sitemap.xml is deleted it gets re-generated quite quickly, as happened last night.

The real problem is that the generator does not recognise dynamic URLs on a special WP page template.

I am thinking of completely chucking the WP XML Sitemap generator and deleting the sitemap.xml for good. Then Google would probably index the sub-pages...

Or, alternatively, it would be interesting to employ a WP plugin that is a bit smarter. I've googled around but didn't find anything like it...
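On the "special WP page template" point: if the template currently reads the parameter straight from $_GET, registering pid as a public query var at least makes it visible to WordPress itself, which any smarter tooling would need. A small sketch, assuming the template can switch to get_query_var():

[code]
<?php
// Sketch: register 'pid' as a public query var so the details template can
// read it with get_query_var() instead of $_GET.
add_filter( 'query_vars', function ( $vars ) {
    $vars[] = 'pid';
    return $vars;
} );

// In the details page template:
// $pid = get_query_var( 'pid' ); // e.g. 'alsecco-alprotect-aero'
[/code]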
@Roopatg (Sep 24, 2014) — Isn't it possible to change your dynamic URLs into static ones in some way?

As for a sitemap with dynamic URLs: I tried creating one with an online sitemap generator, and it could handle the dynamic URLs too.

Please check this link: http://www.xml-sitemaps.com/details-oeko-fakt.14667139.html. There you can download your sitemap.xml file, with your website's dynamic URLs included.
@arvgta (author, Sep 24, 2014) — Hi Roopatg!

    Excellent! Thanks very much!

    The dynamic URLs are recognised by http://www.xml-sitemaps.com/ !

    I've chucked out the XML Sitemap generator.

    I've also created a Google Webmaster Tools entry for http://www.oeko-fakt.de/ and submitted the uploaded sitemap.xml.

Google seems to be happy; the status is "pending", but I trust these pages will be spidered shortly.

    I'll report back, as soon as they're spidered...

    Thanks a million for your effort!
@Roopatg (Sep 24, 2014) — OK, the sitemap should be processed within 24 hours; keep checking in Google Webmaster Tools for the status of your sitemap submission. And I hope you have taken care of all the other on-page strategies to get more visibility in search engines. All the best.
@arvgta (author, Sep 24, 2014) — You're a star! All the best, too!
@arvgta (author, Nov 01, 2014) — Hello all,

    unfortunately, over a month later, the problem persists.

    Here's a link to the sitemap: http://www.oeko-fakt.de/sitemap.xml

    (I've set the update frequency to "daily" everywhere, and re-submitted to Google)
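(At the time, Google also accepted a plain HTTP "ping" carrying the sitemap URL as an alternative to re-submitting by hand; a hedged sketch using the sitemap URL above:)

[code]
<?php
// Sketch: notify Google that the sitemap has changed. The ping endpoint
// accepted a sitemap URL as a query parameter at the time of this thread.
$sitemap = 'http://www.oeko-fakt.de/sitemap.xml';
$ping    = 'http://www.google.com/ping?sitemap=' . urlencode( $sitemap );

$response = wp_remote_get( $ping );          // inside WordPress
// $response = file_get_contents( $ping );   // or from a plain PHP script

if ( is_wp_error( $response ) ) {
    error_log( 'Sitemap ping failed: ' . $response->get_error_message() );
}
[/code]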

So one remaining culprit is the URL structure; the URLs look like this:

    http://www.oeko-fakt.de/produkte/details/?pid=alsecco-alprotect-aero

    or

    http://www.oeko-fakt.de/produkte/details/?pid=renault-zoe

    Are the above URLs valid?

    What else could be the problem?


    Thanks in advance

    Kind regards
@arvgta (author, Nov 01, 2014) — EDIT: Found this: http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/

...which indicates that the URLs are not the problem

(i.e. they should be crawled, even if not ideal).
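One thing worth ruling out alongside the URL structure is an accidental "noindex", either in an X-Robots-Tag response header or in a meta robots tag on the product pages. A quick diagnostic sketch, using one of the URLs above:

[code]
<?php
// Quick diagnostic sketch: check one dynamic product URL for an accidental
// "noindex" in the HTTP response headers or in the returned HTML.
$url = 'http://www.oeko-fakt.de/produkte/details/?pid=renault-zoe';

$headers = implode( "\n", get_headers( $url ) );
$html    = (string) file_get_contents( $url );

if ( stripos( $headers, 'noindex' ) !== false ) {
    echo "An X-Robots-Tag header appears to block indexing\n";
}
if ( stripos( $html, 'noindex' ) !== false ) {
    echo "A 'noindex' marker appears in the HTML (check the meta robots tag)\n";
}
// robots.txt is also worth a look: http://www.oeko-fakt.de/robots.txt
[/code]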


    Any other ideas?
@deathshadow (Nov 01, 2014) — Uhm... from what I'm seeing, every one of your anchors takes me to the exact same page, just with a different URL/request. I'm not seeing any unique content anywhere on any of the pages. What exactly are you expecting search to do other than pimp-slap you for duplicate content?

Wait, do you have some form of scripttardery screwing with the anchors or something? I only get a different page if I click on the anchor (changing the URL) and then do a refresh... I bet that's what's happening; if my following your anchors isn't taking me to a different page, I bet the search engine is having the exact same issues...

Confirmed: disabling/blocking JS actually takes me to new pages. I'd say that Google's recent (past three to four years) shift towards obeying what JS does to navigation and/or generated content is what's making the page fail.

Really, this is just another train wreck of how not to build a website: absolute URLs for nothing, endless pointless DIVs and classes for nothing, and worst of all, endless pointless code-bloat JS for nothing.

If I were to take a wild guess, I'd probably say it's that "ajaxify" nonsense mucking with you... which is the typical "I can haz intarnets" scripting garbage that wouldn't be needed on such a simple site if the markup, CSS, and other scripting weren't such a bloated mess.
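In WordPress terms, the quickest way to test that theory is to dequeue whatever script is hijacking the anchors and then re-check the pages. A sketch for functions.php; the handle 'ajaxify' is only a guess and has to match whatever the theme or plugin actually registers:

[code]
<?php
// Sketch: dequeue the script suspected of hijacking the navigation, so the
// plain anchors (and the crawler) get real page loads again.
// NOTE: the handle 'ajaxify' is a guess; check the actual registered handle.
add_action( 'wp_enqueue_scripts', function () {
    wp_dequeue_script( 'ajaxify' );
}, 100 );
[/code]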
@arvgta (author, Nov 02, 2014) — I'm usually grateful for almost any answer, but yours is so exaggeratedly negative that I can't take it seriously as a whole...

    Anybody else?
@deathshadow (Nov 03, 2014) — [QUOTE]I'm usually grateful for almost any answer, but yours is so exaggeratedly negative that I can't take it seriously as a whole...[/QUOTE]
Opera 12, Opera Next, Firefox -- try navigating from page to page: the URL changes in the address bar, but the page contents DO NOT CHANGE. Turn scripting off and the page works.

    If you're not willing to accept that the scripting is your problem, you apparently don't want the site fixed. If I was negative, it was because you were doing the typical scripting for nothing and code bloat for nothing that I see time and time and time and time and time again where people empty a Mac-10 into their website's head and then wonder why it has problems... the same stupid malfing problems I've been seeing pretty much since the mouth-breathing nonsense known as jQuery was introduced leading person after person down the same garden path to failure. My disgust to the point of nausea at the stupidity of how people are using JS right now may have boiled over into that post, but it's out of frustration at seeing another website that has something almost resembling potential ruined by jQuery and a host of other "I can haz a wabsitz?" practices.

    With things like jQuery and "AJAX for nothing"... Your page would be faster and smoother WITHOUT any of that nonsense if you simply practiced minimalist markup and separation of presentation from content.

You don't want to hear the truth and are unwilling to take that observation seriously, so have fun with your bloated, slow-loading, steaming pile of fail you apparently don't want to fix.
@arvgta (author, Nov 03, 2014) — [QUOTE]...since the mouth-breathing nonsense known as jQuery was introduced leading person after person down the same garden path to failure. My disgust to the point of nausea at the stupidity of how people are using JS right now may...[/QUOTE]

That is nonsense in itself, and shows that you only seem to know black and white and nothing in between. I do sports whenever I'm frustrated, and can only recommend the same!

    Anybody else?
@arvgta (author, Jan 21, 2015) — Hello all,

almost one year later, the problem still persists.

Here's what the URLs look like, for clarity:

  • http://www.oeko-fakt.de/produkte/details/?pid=alsecco-alprotect-aero
  • http://www.oeko-fakt.de/produkte/details/?pid=renault-zoe

In the meantime, I've tried all sorts of settings in Webmaster Tools, in vain.

    Any ideas?


    Thanks and

    Kind regards