
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey, really need some help deciding what to do... I have a .co.uk site; it's my oldest and best site of my network and accounts for maybe 30-40% of my income. Although it's a .co.uk site, it actually makes most of its money from USA traffic and targets many terms for the US market. The problem is that, being a .co.uk, it doesn't rank as well in Google.com, and over the last few years Google has definitely widened the gap in a .co.uk's ability to rank in Google.com. Many terms that I used to be #1 for in Google.com I now rank only in positions 5-10, but in Google.co.uk I'm #1, often with a double listing, so I wouldn't put the loss of rankings in Google.com down to just losing rankings naturally. Now many of my key pages are gradually losing rankings in Google.com, which is not good and really frustrating. So my dilemma is: do I risk my best site and 301 it to a .com hosted in the US for, at a guess, a potential 50% increase in revenue and more future potential (if the 301 worked well and got some US rankings back; I'm sure long tail would increase lots too)? If you have experience 301ing sites to a new domain, could you let me know how it went? Or if you're an SEO who has done this many times, how often on average have the SERPs remained stable/unchanged? I'm trying to work out the reward-to-risk ratio: if on average the transition is seamless 90% of the time, it would seem worth the gamble, but if it's 50%, I would say it's not worth it.

    | goody2shoes
    0

  • Hello, I am currently number 1 for a competitive keyword, so I don't want to push the wrong button and self-destruct! My site is highly focused on one relatively narrow niche, with about 50-60 pages of content bang on topic. I was wondering if Google will discredit my site in any way if I start adding pages that are 'loosely related' to the overall theme of my niche. Some of them are what you might call sister concepts, with maybe one mention of my target keyword in the body. Does the algo value what percentage of the whole site's content is on/off topic? If so, how important is this as a factor? Thanks a lot

    | philipjterry
    0

  • Quick and easy, most likely; just need to clear up a few points. I understand each page within the site should have only one H1 tag, which should be the most important one. I also believe these only affect Google rankings very slightly, right? Currently my CMS is pulling the H1 tag in from the page and automatically using the page heading, i.e. the heading used for the content. Should this be a keyword/key phrase instead? And will it count as duplicate if I use the same one on various pages of my site? Cheers guys, look forward to hearing your feedback

    | wazza1985
    0

  • I have a website that sells images. When you search, you're given a page like this: http://www.andertoons.com/search-cartoons/santa/ I also give users the option to re-sort results by date, views and rating, like this: http://www.andertoons.com/search-cartoons/santa/byrating/ I've seen in SEOmoz that Google might see these as duplicate content, but it's a feature I think is useful. How should I address this?

    | andertoons
    0
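
One common answer to the sort-variant question above is a rel=canonical tag on each re-sorted version pointing back at the default results page. A sketch using the URLs from the question (the /bydate/ and /byviews/ variants are assumptions):

```
<!-- In the <head> of /search-cartoons/santa/byrating/ (and /bydate/,
     /byviews/, etc., if those exist): -->
<link rel="canonical" href="http://www.andertoons.com/search-cartoons/santa/" />
```

This keeps the feature available to users while telling the engines which version to index.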

  • I offer users the opportunity to email and embed images from my website.  (See this page http://www.andertoons.com/cartoon/6246/ and look under the large image for "Email to a Friend" and "Get Embed HTML" links.) But I'm seeing the ensuing pop-up pages (Ex: http://www.andertoons.com/embed/5231/?KeepThis=true&TB_iframe=true&height=370&width=700&modal=true and http://www.andertoons.com/email/6246/?KeepThis=true&TB_iframe=true&height=432&width=700&modal=true) showing up in Google. Even worse, I think they're seen as duplicate content. How should I deal with this?

    | andertoons
    0
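
For the /embed/ and /email/ pop-up URLs above, a hedged starting point is a robots.txt block:

```
User-agent: *
Disallow: /embed/
Disallow: /email/
```

Robots.txt alone won't drop URLs that are already indexed, though; a `<meta name="robots" content="noindex">` on the pop-up templates (left crawlable until the pages fall out of the index) is the usual route for removal.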

  • There are over 300 pages on our client's site with duplicate page content. Before we embark on a programming solution to this with canonical tags, our developers are requesting the list of originating sites/links/sources for these odd URLs. How can we find a list of the originating URLs? If you can provide a list of originating sources, that would be helpful. For example, the following pages are showing (as a sample) as duplicate content: www.crittenton.com/Video/View.aspx?id=87&VideoID=11 www.crittenton.com/Video/View.aspx?id=87&VideoID=12 www.crittenton.com/Video/View.aspx?id=87&VideoID=15 www.crittenton.com/Video/View.aspx?id=87&VideoID=2 "How did you get all those duplicate URLs? I have tried to Google the "contact us", "news", and "video" pages. I didn't get all those duplicate pages. The id=87 on most of the duplicate pages is not supposed to be there. I was wondering how visitors got to all those duplicate pages. Please advise." Note, the CMS does not create this type of hybrid URL. We are as curious as you as to where/why/how these are being created. Thanks.

    | dlemieux
    0
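
One way to find the originating sources asked about above is to mine the server access log for the referrers that lead visitors to the odd URLs. A sketch; the field positions assume Apache "combined" log format, and the sample log below is illustrative (point the grep at your real log file):

```shell
# Stand-in for the real access log -- two illustrative combined-format lines.
cat > access.log <<'EOF'
203.0.113.5 - - [01/Jan/2011:10:00:00 +0000] "GET /Video/View.aspx?id=87&VideoID=11 HTTP/1.1" 200 512 "http://referrer.example/page" "Mozilla/5.0"
203.0.113.5 - - [01/Jan/2011:10:00:05 +0000] "GET /contact-us HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Referer is the 4th double-quoted field in combined format; count each
# distinct referrer that sends traffic to the duplicate Video/View URLs.
grep 'Video/View.aspx?id=87' access.log \
  | awk -F'"' '{print $4}' \
  | sort | uniq -c | sort -rn > referrers.txt
cat referrers.txt
```

The resulting list is exactly what the developers asked for: every page that links into the duplicate URL pattern, ranked by hit count.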

  • Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first. a) Manual observation: within the catalogue view, I loaded up the page source, hit Ctrl-F and searched "href"; it turns out there are 750-odd links on this page, and most of the other sub-catalogue and product pages also have about 750 links. Ouch! My SEO knowledge is telling me this is non-optimal. b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results looked like this: level 0: 1 page; level 1: 42; level 2: 860; level 3: 3,268. Now this looks more like a pyramid. I think this is because Link Sleuth can only read one 'layer' of the nav bar at a time; it doesn't 'hover' and read the rest of the nav bar (unlike what can be found by searching for "href" in the page source). Question: how are search spiders going to read the site? Like in (a) or in (b)? Thank you!

    | DigitalLeaf
    0

  • Below is my redirect setup in the .htaccess file of my root WordPress installation. The www to non-www redirect works well, so no problems there. Other page redirects work well too (example: `redirect 301 /some-page/ http://mysite.com/another-page/`; I didn't post those because I have a few too many). So here it goes:

    ```
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www.mysite.com$ [NC]
    RewriteRule ^(.*)$ http://mysite.com/$1 [R=301,L]

    # BEGIN WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteRule ^index.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    </IfModule>
    # END WordPress

    redirect 301 /archives/10-college- majors/ http://mysite.com/archives/10-college-majors/
    redirect 301 /archives/10-college-%20majors/ http://mysite.com/archives/10-college-majors/
    redirect 301 /archives/10-college-%C2%A0majors/ http://mysite.com/archives/10-college-majors/
    ```

    I'm having a problem with the last 301 redirect: `redirect 301 /archives/10-college-%C2%A0majors/ http://mysite.com/archives/10-college-majors/` is not working. As you can see, I've tried other variations of the "space", but no go. I also used cPanel's Redirect screen, testing all the possible options plus a wildcard. I've also tried this: http://serverfault.com/questions/201829/using-special-characters-in-apache-mod-rewrite-rule (perhaps unsuccessfully, because it caused a 500 server error, and it's a different situation in my case). I also saw something here: http://www.webmasterworld.com/apache/3908682.htm but I don't know if it works or how I would implement it without compromising all my other redirects. Note: the URL displays with a space in the address bar of all major web browsers (http://mysite.com/10-college- majors/) and goes to a 404 page. I have a gorgeous page on a PR6 / high-authority site linking to the URL on my site, but they somehow copied the URL with a space. I contacted the person responsible for the website and he claims it works fine (i.e. he didn't check it). Is there a clean way to redirect ONLY this problematic URL without compromising the other redirects? Any ideas would be great. I'll respond with progress. Thanks in advance.

    UPDATE: the redirect works, and it did work. Even so, when looking at the source of the page linking to mine, the URL looks like this:

    ```
    http://mysite.com/archives/10-college- majors/
    ```

    Clicking the URL in Source View in Firefox takes me to:

    ```
    http://mysite.com/archives/10-college-%C2%A0majors/
    ```

    None of my 301 redirects should direct there. I don't have any redirect plugins either.

    | pepsimoz
    0
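
On the stubborn %C2%A0 rule above: `redirect 301` (mod_alias) compares against the decoded URL path, where %C2%A0 is a literal non-breaking space, which may be why no plain-text variant of the rule ever matches. A hedged sketch, untested against this exact setup, using mod_rewrite's PCRE byte escapes for the two UTF-8 bytes of the NBSP:

```
RewriteEngine On
# \xc2\xa0 matches the raw UTF-8 non-breaking space in the decoded path
RewriteRule ^archives/10-college-\xc2\xa0majors/?$ http://mysite.com/archives/10-college-majors/ [R=301,L]
```

Placing the rule above the WordPress block lets it fire before the catch-all rewrite to index.php, so the other redirects are untouched.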

  • If a client (e.g. a winery) wants to rank both nationally and locally, what are some best practices for doing this on one website? The goals are to: rank nationally for their wines, wine varietals, etc., so they're found by restaurants, distributors, and customers (could include national directories, content creation, etc.); and rank locally for their tasting room and wines, for people searching locally or looking at that specific region (this could also include Google Places, local directories, etc.). I'm wondering if the site would need to be subdivided (or "siloed") so that one section is heavily focused on national and another on regional? Also, for the home page, which focus would be most important (maybe national, because it's harder)? Thanks for any ideas! Tom

    | DirectionSEO
    0

  • I have a client who runs a yacht delivery company. He gets business from the US and the UK but due to the nature of his business, he isn't really based anywhere except in the middle of the ocean somewhere! His site is hosted in the US, and it's a .com. I haven't set any geographical targeting in webmaster tools either. We're starting to get some rankings in google US, but very little in google UK. It's a small site anyway, and he'd prefer not to have too much content on the site saying he's UK based as he's not really based anywhere. Any ideas on how best to approach this?

    | PerchDigital
    0

  • I am currently working on a huge website which ranks very well, receiving 150,000 visitors every day. I have been offered the chance to buy some more domain names which would suit the keywords on my current site. These domains, as keywords, also receive huge amounts of traffic. Would it be beneficial for me to do this? If so, why? Thanks

    | wazza1985
    0

  • Why is this the case?  If there is a filter or penalty on my site from yahoo, how do I ask yahoo to correct this?

    | DavidS-282061
    0

  • I am trying to work out exactly how Google is crawling my site, including entry points and its path from there. The site has millions of pages and hundreds of thousands indexed. I have simple log files with a timestamp and the URL Googlebot was on. Unfortunately there are hundreds of thousands of entries even for one day, and as it is a massive site I am finding it hard to work out the spider's paths. Is there any way, using the log files and Excel or other tools, to work this out simply? Also, I was expecting the bot to go through each level almost instantaneously, e.g. main page --> category page --> subcategory page (expecting the same timestamp), but this does not appear to be the case. Does the bot follow a path right through to the deepest level it can reach (or is allowed to) for that crawl, and then return to the higher-level category pages at a later time? Any help would be appreciated. Cheers

    | soeren.hofmayer
    0
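
On the log-analysis question above, a short script is usually more workable than Excel at this volume. A minimal sketch (the sample rows and the depth heuristic are illustrative assumptions, not the poster's real log format) that buckets bot hits by URL depth, as a first step toward seeing whether the bot descends level by level:

```python
from urllib.parse import urlparse

def crawl_depth_profile(rows):
    """Count bot hits per URL path depth from (timestamp, url) log rows.

    If Googlebot walked the site strictly top-down, depth-0 hits would all
    precede depth-1 hits and so on; in practice crawling is scheduled, so
    hits at every depth interleave throughout the day.
    """
    profile = {}
    for _ts, url in rows:
        path = urlparse(url).path.strip("/")
        depth = 0 if not path else path.count("/") + 1
        profile[depth] = profile.get(depth, 0) + 1
    return profile

# Hypothetical sample rows; a real log would be parsed line by line.
sample = [
    ("2011-02-01 00:00:01", "http://example.com/"),
    ("2011-02-01 00:00:02", "http://example.com/category/"),
    ("2011-02-01 00:00:03", "http://example.com/category/sub/"),
    ("2011-02-01 00:00:04", "http://example.com/category/sub/item"),
]
print(crawl_depth_profile(sample))  # {0: 1, 1: 1, 2: 1, 3: 1}
```

Sorting the rows within each depth bucket by timestamp then shows directly whether deep pages are hit in the same pass as their parents or revisited later.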

  • I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
    http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc, but we have yet to find 'the fix'... Thoughts? Kurus

    | kurus
    0

  • Hey experts, how are you doing? Hope everything is OK! I'm about to launch a new website; the code is almost done. Totally fresh new domain. The site will have around 500,000 pages, fully optimized internally, of course. I have my tactics to make Google "travel" over my site and get things indexed. The problem is: do I release it in "giant mode", or release it "thin" and increase the pages over time? What do you recommend? Release it to the big G at once and let them find the 500k pages (would they think this is spam or something like that)? Or release like 1k-2k per day? Does anybody know a good approach to improve my chances of success here? Any word will be appreciated. Thanks!

    | azaiats2
    0

  • Is there any SEO benefit for links shared in Facebook feeds or wall posts?

    | NinjaTEL3
    0

  • Hello all, All our category pages www.pitchcare.com/shop are linked to from every product page via the sidebar navigation, which results in every category page having over 1,700 links with the same anchor text. I have noticed that the category pages don't appear to be ranked when they most definitely should be. For example, http://www.pitchcare.com/shop/moss-control/index.html is not ranked for the term "moss control"; instead, another of our deeper pages is ranked on page 1. From a previous SEOmoz article, "Excessive Internal Anchor Text Linking / Manipulation Can Trip An Automated Penalty on Google":
    "I recently had my second run-in with a penalty at Google that appears to punish sites for excessive internal linking with 'optimized' (or 'keyword stuffed anchor text') links. When the links were removed (in both cases, they were found in the footer of the website sitewide), the rankings were restored immediately following Google's next crawl, indicating a fully automated filter (rather than a manual penalty requiring a reconsideration request)." Do you think we may have triggered a penalty? If so, what would be the best way to tackle it? Could we add nofollows on the product pages? Cheers, Todd

    | toddyC
    0

  • In 2008 we performed an experiment which showed some seemingly random behaviour by Google (indexation, caching, PageRank distribution). Today I put the results together, analysed the data we had, and got some strange results which hint at the possibility that Google purposely throws in a deviation from normal behaviour here and there. Do you think Google randomises its algorithm to prevent reverse engineering and enable chance discoveries, or is it all a big load-balancing act which produces quasi-random behaviour?

    | Dan-Petrovic
    0

  • I have a site that is successful in the SERPs for a certain geography; let's call it City A (I'm sure you can't tell what it is from my username). I'm moving to a new city in another state, so I will be building my business in this area (City B). Should I create a new domain for City B with CityBWebsiteDesign.com, or should I create a sub-domain called CityB.BrandableCompanyName.com and just redirect CityBWebsiteDesign.com to that URL for offline marketing purposes only? My current website BrandableCompanyName.com has some authority with Google. Would it be better to build something on the sub-domain and get some sort of cross-benefit, or are there really no benefits to be had between sub-domains? The benefit of going with CityBWebsiteDesign.com would be having a keyword-rich URL, but I would basically be starting from zero in building authority. Specific experience you've had with this or cited examples would be great for the discussion! Thanks,
    Jared

    | JaredDetroit
    0

  • By paying you guys each month will you be making my website more visible and accessible or will you only point out the mistakes I should fix?

    | vacksah
    0

  • For some reason, our URLs change from "www.apprenda.com/ANYTHING" to "apprenda.com/ANYTHING". These register as different pages, though? We have rankings in SEOmoz Pro for terms where our homepage shows up 6th on Google, but SEOmoz says it's not on the first page because it's checking against apprenda.com and not www.apprenda.com. Also, it seems that pages with trailing slashes register differently from those without. Should we be doing something about that, something to make sure all pages get rewritten either with the trailing slash or without? For instance, this URL: http://apprenda.com/saasgrid/features/multi-tenancy/ and this URL: http://apprenda.com/saasgrid/features/multi-tenancy are really the same page. Yet in our analytics, they register as different pages with their own stats, etc. What should we do in our particular case, and how can we get this fixed? I really appreciate the help, and thanks in advance! Jesse

    | ApprendaPlatform
    0
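
For the www/non-www and trailing-slash variants above, one common approach (a sketch for Apache; verify against your own server before deploying) is to pick one canonical form of each and 301 everything else to it:

```
RewriteEngine On
# Canonical host: redirect apprenda.com to www.apprenda.com (or the reverse).
RewriteCond %{HTTP_HOST} ^apprenda\.com$ [NC]
RewriteRule ^(.*)$ http://www.apprenda.com/$1 [R=301,L]
# Canonical path: append a trailing slash to URLs that aren't real files.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

A rel=canonical tag on each page covers whichever variants slip through, and keeps the analytics and rankings data consolidated on one URL per page.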

  • I want to specify a rel canonical link for each category page; how do I do that without changing the code (just from the admin section)? Filters and sorting searches are making the site duplicate content with their parameters. If there is a way, please specify the method; I want to avoid hours of working on a script like this. Thanks.

    | oneticsoft
    0

  • I'm using a forum plugin called Simple Press, and the rest of my site is looking good, with only a few minor errors due to a long URL. Anyway, the only 4 major errors I have are these. These 3 links have no titles, so is there somewhere I can give them titles, or do a rel=nofollow? /index.php?sf_ahah=acknowledge /index.php?sf_ahah=permissions /index.php?sf_ahah=tags And then the 3 above, plus this one: http://www.societyforethicsand…..?xfeed=all have no meta description associated with them. So, is there somewhere I can add the meta description for all 4? I have spoken to support, and it turns out the first 3 links with no titles are AJAX content for pop-ups. Instead of waiting for them to work out how to resolve this issue, does anyone know how to stop them coming up as major errors?

    | CosmikCarrot
    0
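
One option for the sf_ahah AJAX endpoints above (assuming the crawler honours query strings in robots.txt patterns, as Googlebot generally does) is to disallow them outright rather than retrofitting titles and descriptions onto popup fragments:

```
User-agent: *
Disallow: /index.php?sf_ahah=
```

Blocked URLs should then stop surfacing as missing-title/missing-description errors in most crawl reports.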

  • Hi all, For a project I'm working on there will be an opportunity to have a number of websites link back to our main site. Rather than giving out a straight forward link text I'm more interested in building and handing out some kind of widget which is topical to both us and the websites giving us links. Although I do a fair bit of web development with various technologies I have never played around with building widgets as such which can pull in data feeds from our database etc... Does anyone have any good recommendations of tutorials covering this area or alternatively any companies offering this kind of widget building service. Thanks in advance, Darren

    | DarrenAtkinson
    0

  • I have a particular page which shows primary contact details as well as "additional" contact details for the client. Given that I don't want Google to misinterpret the focus of the page away from the primary contact details, which of the following three options would be best? 1) Place the "additional" contact details (with maps) in JavaScript, Ajax or similar to suppress them from being crawled. 2) Leave the "additional" contact details alone but emphasize the primary contact details by placing them in rich snippets/microformats. 3) Do nothing and allow Google to crawl the pages with all contact details. Thanks, Phil

    | AU-SEO
    0

  • We carry a few brands that have special foreign characters, e.g., Kühl, Lolë, but do search engines recognize special unicode characters? Obviously we would want to spend more energy optimizing keywords that potential customers can type with a keyboard, but is it worthwhile to throw in some encoded keywords and anchor text for people that copy-paste these words into a search? Do search engines typically equate special characters to their closest English equivalent, or are "Kuhl", "Kühl" and "Kühl" three entirely different terms?

    | TahoeMountain40
    0
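
On the special-characters question above, two points can be shown concretely: the same accented word can exist in two different byte sequences (precomposed vs. combining forms, which Unicode normalization reconciles), and stripping diacritics maps "Kühl" onto the keyboard-typable "Kuhl". A small illustration (how any given search engine folds these internally is not public, so this only demonstrates the character-level mechanics):

```python
import unicodedata

def ascii_fold(term):
    """Strip diacritics: 'Kühl' -> 'Kuhl'.

    Decompose to NFD (base letter + combining marks), then drop the marks.
    """
    decomposed = unicodedata.normalize("NFD", term)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

# Two visually identical strings can differ at the byte level:
composed = "K\u00fchl"      # precomposed u-umlaut
decomposed = "Ku\u0308hl"   # 'u' + combining diaeresis
print(ascii_fold(composed))                                   # Kuhl
print(composed == decomposed)                                 # False
print(unicodedata.normalize("NFC", decomposed) == composed)   # True
```

So "Kühl" and "Kühl" pasted from different sources may not even be the same string until normalized, which is worth keeping in mind when comparing keyword-report rows.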

  • Trying to figure out how best to optimize the timing of new content, including blogs and other on-page content?

    | AaronSchinke
    0

  • If a user runs a search that returns no results, and the server returns a 204 (No Content), will Googlebot treat that as the rough equivalent of a 404 or a noindex? If not, then it seems one would want to noindex the page to avoid low quality penalties, but that might require more back and forth with the server, which isn't ideal. Kurus

    | kurus
    0
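
Whether Googlebot equates a 204 with a 404 isn't documented, so a safer sketch (framework-agnostic, names hypothetical) is to answer empty searches with an explicit robots directive in a single response, avoiding the extra round trip the question worries about:

```python
def empty_search_response():
    """Return (status, headers, body) for a search with no results.

    Serving a normal page with a noindex directive (header and/or meta tag)
    gives the crawler an explicit instruction in one response; a 204 leaves
    the interpretation up to the engine.
    """
    headers = {"X-Robots-Tag": "noindex"}
    body = ("<html><head><meta name=\"robots\" content=\"noindex\"></head>"
            "<body>No results found.</body></html>")
    return 200, headers, body

status, headers, body = empty_search_response()
print(status, headers["X-Robots-Tag"])  # 200 noindex
```

The X-Robots-Tag header and the meta tag are redundant here on purpose; either alone is normally sufficient.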

  • Hello All, I am optimizing three websites for a services-based company in the South Jersey area. Of course, within South Jersey there are certain counties, cities and towns I would like to show up for; for example: "Pool Cleaning South Jersey", "Pool Cleaning Cherry Hill NJ", "Pool Cleaning Burlington County NJ", "Pool Cleaning Voorhies NJ". Do I need to create a page on my websites for every possible county, city and town I want to rank for? This would entail creating thousands of pages targeting these geographic keywords. I have seen other similar sites just list all the counties, cities and towns they service in the footer, and it seems to work. Of course, this would be beneficial for any business looking to rank not only in their home base but in a predetermined radius around it as well. Thanks so much, Bill

    | wparlaman
    0

  • We're looking for a company that can help us optimize our google product feed.  Does anyone have any recommendations or suggestions? Thanks!

    | eric_since1910.com
    0

  • I caught a dropped domain with a nice keyword but a poor reputation. It used to have some malware on the site, and WOT (a site review tool available in Chrome, among others) has very negative reviews tied to the site. I guess that Google must have records of that as well, because Chrome used to show a warning when I entered the site. My question is: how long will the bad reputation last if I build a legitimate website there?

    | zapalka
    0

  • Hi, I just wanted to get some clarification on whether Google would penalize your site if you had many links coming from a questionable site. We've been struggling with rankings for years even though we have one of the oldest sites in the industry with a good link profile and the site is well optimized. I was looking through webmaster tools and noticed that one website links to us over 100,000 times, all to the home page. The site is www.vietnamfuntravel.com. When I looked at the site it seems that they operate a massive links exchange, I'm not sure what the history is and why they link to us so much though. Is there any chance that this could impact us negatively? if it is then what would be the best way to deal with the situation? I could ask them to take the links down but can't guarantee they would do it quickly (if at all). Would blocking their domain from our htaccess file have the desired effect?

    | Maximise
    0

  • What advice do you have for achieving verification in Google Places for a client? I have a client at the moment; I tried getting the call sent through and I'm not sure what happened, but a couple of tries at this did not work. I've tried the postcard route and I'm still waiting. Do I need to be more patient in Australia for this verification postcard? Is there a way I can verify the info myself? Note: I have set up a separate email to handle a lot of the link building, but this is different from their business email, which Google uses.

    | iSenseWebSolutions
    0

  • My site has faceted navigation that allows shoppers to filter category page results by things like brand, size, price range, etc. These pages 302 redirect to the same page they came from, which already includes canonical meta tags. I added the rel="nofollow" attribute to the facet links and added the line "Disallow: /category_filter/" to robots.txt. One of our SEO consultants told me that this is likely diluting the potency of the page's link juice, since it is divided among all of the page's links, including the links I am instructing crawlers to disregard. Can anybody tell me whether I am following best practices for links that redirect to the same page?

    | TahoeMountain40
    0

  • Hello All! Our site uses dynamically generated pages. I was about to begin the process of optimising our product category pages www.pitchcare.com/shop. I was going to use internal anchor text from some high-ranking pages within our site, but each of the product category pages already has 1,745 links! Am I correct in saying that internal anchor text links work only up to a certain point (maybe 10 or so links), so any new internal anchor text links will count for nothing? Thanks, Todd

    | toddyC
    0

  • If I have 2 domains with different content on the same topic, and each one lives on its own IP address, what could be the result if I do a permanent redirect of just one internal page from one domain to the counterpart page on the other? What if I use rel=canonical instead of a 301? Thank you!

    | kolio_kolev
    0

  • We are switching one of our sites to a Magento site and don't want to lose current rankings. What are the best practices for this? Same domain, but the deep URL pages will change URLs.

    | DavidKonigsberg
    0

  • Hi, I'd like to ask what I should do in my situation. I've shortened my URLs from something like this: domain.com/module/action/type/id/keyword to this: domain.com/keyword. After the 301, the SERPs refreshed and positions stayed the same (yeah, lucky me :). After 2 days I got some high-PR links (4 and 5). After 8 days my new URL disappeared for one keyword. It's now been 6 days... I've removed these links and still no results. So the question is: what should I do? Remove the new URL and replace it with the old one? Get new links?

    | sui
    0

  • I have recently come across several bloggers that have been trying to formulate the best concise definition of SEO. What one sentence definitions have you used / seen? Avoid run-ons. Tweetable, even better.

    | Gyi
    0

  • Let's suppose that I want to rank for the keyword "hotels". If I put this keyword in ALL of the link anchor texts then Google will very likely penalize the site. My question is: How many keyword variations should I use in anchors (provided I want to rank for just one KW i.e. "hotels")? Would one keyword variation be okay and is it fine to use main keyword in 80% anchors and the keyword variation(s) in just 20% anchor texts, such as : hotels 80% cheap hotels 20% Note: I do not want to rank for "cheap hotels", just want to use it as an anchor variation of my desired keyword "hotels". Thanks!

    | RightDirection
    0

  • I'm curious if anyone here running a large, complex, dynamic site has used the Apache mod_rewrite module to simplify their site's URLs by rewriting them in a standard format. The chief use of this module for SEO purposes would be to aid in canonicalization and reduce duplicate content. For example, you could easily convert all of your ALL CAPS or MixedCase URLs to lower case, change all "/index.html" URLs to just point to "/", change all word separators to hyphens, and so on. Any server-side ninjas out there with stories to tell? 🙂

    | jcolman
    0
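
A classic example of the lowercase rewrite mentioned above uses `RewriteMap` with the built-in `tolower` function. Note that RewriteMap must live in the server or virtual-host config, not .htaccess, so this is a sketch rather than a drop-in:

```
# In httpd.conf / vhost config:
RewriteMap lc int:tolower
RewriteEngine On
# 301 any URL containing uppercase letters to its all-lowercase form.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ /${lc:$1} [R=301,L]
```

The same pattern (a RewriteCond guard plus a single R=301 rule) extends naturally to the /index.html and word-separator cases.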

  • Hi, I have way too many 302 redirects; how can I bulk-change these to 301s? I have started in cPanel, but I could be old by the time I finish.

    | freedomelectronics
    1
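
If the 302s live in an .htaccess file, a one-line sed pass is far faster than clicking through cPanel. A sketch; the sample file below stands in for the real one (back yours up first, and adjust the pattern if your rules use `Redirect`/`RedirectMatch` capitalisation):

```shell
# Stand-in for the real .htaccess with a couple of 302 rules.
cat > .htaccess <<'EOF'
redirect 302 /old-page/ http://mysite.com/new-page/
redirect 302 /old-post/ http://mysite.com/new-post/
EOF

cp .htaccess .htaccess.bak                        # keep a backup
sed -i 's/^redirect 302 /redirect 301 /' .htaccess  # flip every 302 rule to 301
cat .htaccess
```

After the pass, every `redirect 302` line becomes `redirect 301` in place, and the .bak copy lets you roll back if anything breaks.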

  • Hello, I have what I think is a noob question. I have a medium-size website and need to put it into maintenance for the next 2 months, then afterwards activate a completely new site. My client asked me to do this because the same people who run the constant flow of information on the site are the ones who are going to develop the new site, so he wants to just close it out. So... what are the steps for doing this with minimum impact on any SEO advances made these past months? How do I tell the search engines: "Hey, just under maintenance for a while... then... I'm back in the game, but this is my new structure, and the old one should go here"?

    | daniel.alvarez
    0

  • Is there any way to predict if and how organic traffic would change if we successfully added some high-quality links to our website? Quantifying link value would help us plan how much time/effort we should spend on quality link building. I understand that the more good links we get, the better. But beyond that, I am looking for some methodology/data/formulas that would help decide if links are worth pursuing. Here is an example: let's say we acquired 20 high-quality links from PR 0-5 pages on trusted websites of PR 6-8, and those pages also link to 10-20 other websites. Would such a campaign be of some direct value to our e-commerce website of PR 6? My question is limited to how high-quality links improve overall Google search traffic to the website. I am not interested in calculating the value of individual keywords (most of our search traffic comes from the long tail), nor in estimating referral traffic; both seem much easier topics to tackle. But how would I measure the value of, say, 1 link from a PR 8 site with a PR 3 page, when there are 10 other external links on that page?

    | Quidsi
    0

  • If my site has just five content pages, instead of 25 or 50, then will it get penalized by Google for a given moderately competitive keyword?

    | RightDirection
    0

  • Probably the most n00b question of all, but once I understand this I will be able to research on my own from here: if a search engine produces results from the keywords on individual website posts/pages, then why are the keywords I choose for my homepage so important, given that the general homepage meta keywords are essentially ignored by the search engines? Should I repeat my primary keywords on EVERY post, in addition to the ones that relate to that individual post, or am I misunderstanding something fundamental? My new site is http://splatterMUSIC.com and I want to be at the top of the results for anyone wanting to watch music vlogs, album reviews, music lessons, funny music-related videos, new non-major-label music videos, and all kinds of other concert footage, etc.

    | SEOsolver
    0

  • Hi there! Some doubts are clouding my head and I need some assistance from you to get on the right track. I'll explain my situation and want to hear what you really recommend for medium/long-term permanent results. 1. I have a PR2 (.com.br) domain. 2. I'm talking about low/medium-competition micro-niche keywords. 3. I have all the pages I want indexed (I have a well-SEO-constructed website with internal link building). 4. If a keyword has average competition, I'll already start ranking on page 3 of the SERPs; for a few low-competition keywords I start on page 1. 5. I do a little whitehat link building, 1 or 2 backlinks on authority sites, and then about 15 days later I come to page 1, generally in position 9/10. And then I get stuck 🙂 There are no more authority sites where I can get backlinks... I do some posts on the company Twitter/Facebook pages, but they are nofollow, so I don't really know if this can help (I've never seen a SERP result from it). I did some blackhat stuff to see if it really works: I can say for sure the "profile backlinks" you can buy from some sites don't work (maybe it's just me). I can't see them in Webmaster Tools, and my ranks haven't changed since I bought a pack of 100 links to test (the links are working; I checked them one by one). Maybe the problem is the domains: my site is .com.br and I'm buying .com profile links. I guess Google considers backlinks from .com.br more valuable for my sites. Back to whitehat: I wrote some articles and posted them the right way, on .com.br article sites of course, got them indexed and can see the backlinks in Webmaster Tools, but no change in the SERPs (maybe this is a long-term result and I'm not seeing it yet). I'm really itching to do some blackhat stuff, but I don't want to lose what I have already done... I've heard a lot about ScrapeBox but don't feel comfortable spamming a load of blogs.
I really want long-term permanent results (my sites are totally whitehat/corporate sites). Can you expert guys give me some pointers on where I need to "walk" now to improve in the SERPs? I've never reached #1 and want to rank at least once to understand how it can be done... I'm now thinking of paying someone to rewrite 20 copies of an article and put them up on some sites, to see if 20 can improve something. But I'm still not confident, because it will cost about $100 for a good writer to do it in my language; maybe I can do better things with 100 bucks. I guess I've followed the right path: internal SEO -> got indexed -> backlinks from authorities -> article backlinks to me (is that OK at this position, or not?) -> (what next?). I know SEO is hard, never-ending work, but what I'm trying to get clear in my head is the path of the work (if a right path really exists). Every word will be appreciated. What do you suggest I try now? (Please give me a hint to see SERP results 🙂 If something works, no matter what it costs me, I'll pay for the work happily.) Sorry if I'm a little confusing; English isn't my first language. Thanks.

    | azaiats2
    0

  • I've read several blogs discussing how including more than one H1 per page is a serious no-no. But what is the most effective heading tag (`<h2>`, `<h3>`, etc.) to use for your global navigation system? Or should it not be a heading tag at all?

    | calin_daniel
    0
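
    One common answer to the question above (a sketch of one widely used pattern, not the only valid approach) is to reserve the single `<h1>` for the page's main topic and mark the global navigation up as a plain list, with no heading tag at all:

    ```html
    <!-- Global navigation as a list, not headings; link targets are hypothetical -->
    <nav>
      <ul>
        <li><a href="/">Home</a></li>
        <li><a href="/products">Products</a></li>
        <li><a href="/contact">Contact</a></li>
      </ul>
    </nav>

    <!-- The one H1 on the page describes its content, not the site chrome -->
    <h1>Main topic of this page</h1>
    ```

    The reasoning is that heading tags describe the content hierarchy of the page, and site-wide navigation is not part of that hierarchy, so styling it with CSS on a list is cleaner than spending an `<h2>` on it on every page.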

  • Hi guys, I have been persevering with this ranking for some time now and thought you might be able to help, or direct me to where I can get help. I am learning a lot through SEOmoz but I am still very green. Basically, on the 20th of December we jumped up to a 2nd-place listing, and then dropped back down on the 17th of January 2011. The site is http://mlb.broomeaccommodation.com.au and the search term is 'Broome Accommodation'. As you can see it is a considerable drop and is really affecting our bookings and sales figures. I have attached a link to a screen capture of the problem: http://exitforward.com/kimberleyaccomm/seomoz.png Interested to hear your thoughts and get some help on this frustrating matter. Kind regards, Bodie Czeladka

    | Bodie
    0

  • If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!

    | Interesting.com
    0
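
    On the removal question above: if the pages are gone for good and redirects aren't practical, one option (a sketch assuming an Apache server with mod_alias; the URL path is a hypothetical example, not from the actual site) is to return `410 Gone` instead of letting the URLs fall to `404`, which signals to crawlers that the removal is deliberate and permanent:

    ```apache
    # .htaccess sketch: mark deleted subcategory index pages as permanently gone
    # (/subcategory/ is a made-up path pattern for illustration)
    RedirectMatch gone ^/subcategory/.*$
    ```

    Whether removed all at once or gradually, the deleted URLs will also drop out of the index faster if they are pruned from the XML sitemap at the same time.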
