The same problem exists with separate pages: a robot won't be able to find them, because the href links would be something like href="#sub-page.html".
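For illustration, here is a minimal jQuery Mobile multi-page document of the kind being described (the ids and text are made up). The link is a fragment pointing inside the same file, so a crawler sees no new URL to follow:

```html
<!-- One physical HTML file containing two JQM "pages" -->
<div data-role="page" id="home">
  <a href="#sub-page">Go to sub page</a>  <!-- fragment link: no separate URL for crawlers -->
</div>
<div data-role="page" id="sub-page">
  <p>Content a crawler may never reach.</p>
</div>
```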
I think there is a way to manage this:
On stacked pages:
- use the "google section". I never used it, but there are HTML comments for the Google bot; try something yourself, because I won't for this case.
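The "google section" comments mentioned above are presumably the googleon/googleoff comments. Note these are documented for the Google Search Appliance, and the public Googlebot may simply ignore them, so treat this as an assumption to test:

```html
<!--googleoff: index-->
<p>Content the indexer is asked to skip.</p>
<!--googleon: index-->
```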
On separate pages:
1. Create a sitemap referencing every page; I hope this would work.

If step 1 (or something similar) ends up not referencing anything, just forget about step 2; it would be completely useless.
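A minimal sitemap of the kind step 1 describes might look like this (the domain and filenames are placeholders). The point is that each sub-page gets a real, crawlable URL instead of a fragment:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/index.html</loc></url>
  <url><loc>http://example.com/sub-page.html</loc></url>
</urlset>
```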
My explanation might be confusing; if someone understood me, please rewrite it so everyone can understand.
It's been two years since the original post. In the meantime, JQM has been widely adopted. One would hope that Google, more than anyone, keeps up with trends.
However, placing ads on multiple JQM pages still technically violates the terms of service of many ad networks, because you might end up with too many ads in the same document. (JQM loads multiple pages into the same document - always at least two pages if you do a transition.)
That's more of a technical violation with no real harm done, though.
If you create a multi-page document, though, then there is a real problem, and it's not easy to solve. The ads on the various pages will all be loaded up front. Now you have inflated download counts, but the ads may never actually be seen. That really screws up the metrics.
So, ad networks will need to develop code that is compatible with JQM and that solves the problem of counting views.
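One way such code could work, sketched here purely as an assumption: only request a page's ad the first time that page is actually shown, so impressions match real views. The jQuery Mobile wiring is shown as a comment; `markAdShown` is a made-up helper name, not any network's real API:

```javascript
// Sketch: track which JQM pages have already requested their ad,
// so each page's ad is fetched only when that page is first shown.
var adLoaded = {};

function markAdShown(pageId) {
  // Returns true exactly once per page id, false on every later call.
  if (adLoaded[pageId]) return false;
  adLoaded[pageId] = true;
  return true;
}

// In the browser, one would wire it up roughly like this
// (pageshow is jQuery Mobile's page-display event):
// $(document).on("pageshow", "[data-role='page']", function () {
//   if (markAdShown(this.id)) { /* request the ad for this page */ }
// });
```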
(reply by watusiware)