SSL Cert error
-
Just implemented SSL with a wildcard cert, and I got an email from Google that my non-www cert is not valid.
Any ideas ?
SSL/TLS certificate does not include domain name https://electrictime.com/
To: Webmaster of https://electrictime.com/,
Google has detected that the current SSL/TLS certificate used on https://electrictime.com/ does not include the https://electrictime.com/ domain name. This means that your website is not perceived as secure by some browsers. As a result, many web browsers will block users accessing your site by displaying a security warning message. This is done to protect users' browsing behavior from being intercepted by a third party, which can happen on sites that are not secure.
-
I suggest just setting up a redirect from the non-www to the www version, and from http to https. This link may be helpful for you.
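If the site runs on Apache with mod_rewrite, a redirect along these lines in .htaccess is one common way to do it (a sketch only, using the electrictime.com hostname from the question; adjust the host and rules to your own setup):

```apache
# Send any request that is either plain HTTP or on the non-www host
# to the canonical https://www version with a single 301 redirect.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.electrictime.com/$1 [L,R=301]
```

Combining both conditions into one rule avoids chaining two separate 301s (http → https → www), which is slightly better for crawlers and users alike.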
-
How did you implement it? Do you have access to Apache? If you do, this is how I have the SSL CRT/KEY files set up on about 20 of my websites, if you want something to check against. If you did it with php.ini, you'll probably need to check the walkthrough with your web host.
<VirtualHost *:443>
    ServerAdmin some@email.com
    ServerName website.com
    ServerAlias www.website.com
    DocumentRoot /var/www/html/website/

    SSLEngine on
    SSLCertificateFile "/var/key/website.com.crt"
    SSLCertificateKeyFile "/var/key/website.com.key"

    <Directory /var/www/html/website/>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
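Alongside the :443 vhost, you'd typically keep a plain :80 vhost whose only job is to send visitors to the HTTPS site (a sketch using the same placeholder names as above):

```apache
# Companion port-80 vhost: redirect all plain-HTTP traffic to the HTTPS site.
<VirtualHost *:80>
    ServerName website.com
    ServerAlias www.website.com
    Redirect permanent / https://www.website.com/
</VirtualHost>
```

With this in place, every http:// request gets a 301 to the secure www version before the content is ever served.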
-
Hi Thomas
If you registered the SSL certificate for the www host, it won't necessarily cover the non-www host. Note that a wildcard certificate for *.website.com covers www.website.com but not the bare website.com apex, unless the apex is also listed as a Subject Alternative Name. Make sure the version you bought the certificate for matches the configuration you are using: if you bought an SSL certificate for www, then use www.
I have this issue with one of our sites but it is only saying this in the Opera browser.
https://stackoverflow.com/questions/40309552/do-i-need-an-ssl-certificate-for-www-and-non-www
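One way to check which hostnames a certificate actually covers is to inspect its Subject Alternative Name list with openssl. The sketch below generates a throwaway self-signed certificate (so it runs anywhere, no live site needed) and then prints its SAN entries; for a real site you would point the second command at your own .crt file, or fetch the served certificate with `openssl s_client` first:

```shell
# Create a throwaway self-signed cert listing both hostnames
# (the -addext flag needs OpenSSL 1.1.1 or newer).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo.key -out /tmp/demo.crt \
  -subj "/CN=website.com" \
  -addext "subjectAltName=DNS:website.com,DNS:www.website.com"

# Print the SAN list -- both the www and non-www names should appear here.
openssl x509 -in /tmp/demo.crt -noout -ext subjectAltName
```

If the non-www name is missing from that list on your real certificate, that is exactly the condition Google's email is warning about.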
Regards
Nigel
-
Hi There,
Your website is correctly redirected to the https://www version, so I wouldn't worry about the error, as the non-www version is not relevant in your case. You may also read the following articles on why this problem may have occurred:
https://www.clickssl.net/blog/do-i-need-different-ssl-certificates-for-www-non-www-domain
https://www.quora.com/How-can-an-HTTPS-certificate-work-for-both-the-www-and-non-www-domains
I hope this helps, let me know if you have further queries.
Best Regards,
Vijay