
CodeIgniter Remove index.php By .htaccess

In this tutorial I am going to show how to remove index.php from the URL using a .htaccess file in CodeIgniter. .htaccess is short for Hypertext Access, a powerful per-directory configuration file used by Apache-based web servers to control various server features.
Why exactly do we want to remove index.php? CodeIgniter's URLs are designed to be friendly to both search engines and humans, so to keep them that way we remove the default index.php segment that appears in every CodeIgniter URL, leaving the URLs clean and search-engine friendly. For example, http://example.com/index.php/blog/view becomes http://example.com/blog/view.
The steps below contain the code, with brief notes, needed to make this change.


Steps To Remove index.php Using .htaccess:

Step 1: Open the file config.php located in the application/config folder, then find and replace the code below.
//  Find the below code

$config['index_page'] = "index.php";

//  Replace it with an empty string

$config['index_page'] = "";
Step 2: Go to your CodeIgniter root folder and create a .htaccess file.

Path:
Your_website_folder/
application/
assets/
system/
user_guide/
.htaccess <--------- this file
index.php
license.txt
Step 3: Add the code below to the .htaccess file.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L]
</IfModule>
Step 4: In some cases the default setting for uri_protocol does not work properly. To fix this, open the file config.php located in application/config, then find and replace the code as follows.
//  Find the below code

$config['uri_protocol'] = "AUTO";

//  Replace it with

$config['uri_protocol'] = "REQUEST_URI";

Conclusion:

Hopefully these steps helped you remove index.php from your CodeIgniter URLs using .htaccess. Keep reading our blogs.

Force www With htaccess

For SEO it is important to present your domain in one consistent way. The most popular approach is to use the www sub-domain. Many people don't realise that http://www.domain.com and http://domain.com serve the same page, yet they are different URLs.
If a search engine crawls http://www.domain.com, it will check the content and index it. If it later crawls http://domain.com, it will see the same content on a different URL, treat it as duplicate content, and may penalise the domain in its rankings.

Consistent URLs

The best way to get around this problem is to make sure all the URLs on your site use a consistent format, either www.domain.com or domain.com. It makes no difference which format you choose; the search engine only cares about duplicate content.

Use www

I prefer to make sure all my domains use the www sub-domain, since this is the format most people are used to seeing in search engines and on websites. If people are used to seeing it, they won't give the URL a second thought.
Once you have chosen your preferred format and changed all the links on your site to use it, you are done, right? Wrong. If you stop there, you run the risk of someone linking to you without the www on the front; the search engines will crawl that link, see duplicate content and penalise your domain.
To combat this you have a couple of options. One is to check the URL on each page and, if it doesn't contain www, perform a 301 redirect to the same page on the www sub-domain. The better solution is to do the same thing in htaccess, which runs before any page is loaded and rewrites the URL with a 301 redirect.
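If you did want to do the per-page check at the application level instead, a minimal PHP sketch (placed early in a shared header or bootstrap file, before any output) might look like the following; the scheme detection is a simplification and may need adjusting for your environment:
<?php
// If the host does not start with "www.", send a 301 redirect to the
// same path on the www sub-domain, preserving the current scheme.
$host = $_SERVER['HTTP_HOST'];
if (strpos($host, 'www.') !== 0) {
    $scheme = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
    header('Location: ' . $scheme . '://www.' . $host . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}
?>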

.htaccess To Redirect URLs

To use htaccess to redirect URLs, just copy and paste the snippet below and replace example.com with your domain.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301,NC]

Add www With HTTP/HTTPS

This variant preserves the original scheme: the %{HTTPS}s condition appends a literal "s" to the HTTPS variable, so the capture %1 is "s" when HTTPS is on and empty otherwise, and the redirect goes to https:// or http:// accordingly.
RewriteCond %{HTTP_HOST} !^$
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTPS}s ^on(s)|
RewriteRule ^ http%1://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

Reduce Comment Spam With htaccess in WordPress

Comment spam is a problem for many blogs, and one of the biggest sources is spam bots crawling sites and automatically submitting comments. The code snippet below checks the HTTP_REFERER on requests to wp-comments-post.php and makes sure the page can only be reached from your own domain.
Below is a snippet you can add to your .htaccess to help reduce the amount of spam coming through the comments on your WordPress blog.
Just add the following to your htaccess file to cut down on spammers.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{REQUEST_URI} wp-comments-post\.php
RewriteCond %{HTTP_REFERER} !.*yourdomainname.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* http://%{REMOTE_ADDR}/ [R=301,L]
</IfModule>
This snippet checks that the POST to wp-comments-post.php was referred from your own domain; requests with a foreign referer or an empty user agent are redirected back to the sender.

Blacklist IP Addresses With htaccess

Here is a good snippet to add to your htaccess file to completely block spammers from your site.
If you have a WordPress site you can get a lot of spam comments, and it can take up a lot of your day to go through and delete them. There are plugins that will delete these spam comments for you, or you can use htaccess to block the spammers from even reaching your site. The WordPress comments page records each commenter's IP address; if you know a user is a spammer, you can copy that IP address into your htaccess to block them from ever coming back.
Copy and paste the following and replace the deny from xxx.xxx.xxx.x lines with the IP addresses you want to block.
<Limit GET POST PUT>
order allow,deny
allow from all
deny from 123.123.123.1
deny from 555.555.555.5
deny from 000.000.000.0
</Limit>
If you want to block access to a specific file with htaccess, use the following snippet to deny access to the WordPress login page.
<files wp-login.php>
order deny,allow
deny from all
</files>
This functionality is really useful when you are developing a new site and want to put it on a live server without making it accessible to the outside world. Using the code below you can block everyone from seeing your site unless they come from a specific IP address.
That way you can open the site up to your designers, testers, your wireless network, HTML validators and so on, so it can be tested thoroughly before it is opened up to the public.
<Limit GET POST PUT>
     Order Deny,Allow
     Deny from all
     
     # Designer IP
     Allow from 111.222.333
          
     # Tester IP
     Allow from 777.888.999
     Allow from 123.456.789
     Allow from 456.789.123
     Allow from 789.123.456
     
     # Wireless
     Allow from 000.111.222
     
     # W3C CSS & HTML validators
     Allow from 654.789.321
</Limit>

Set Expire Headers In htaccess

One way browsers speed up page rendering is by caching static assets such as images, CSS and JavaScript files. Because browsers cache this data, you can set an expiry header to tell the browser how long it may keep its cached copy before requesting the file again.
The browser will keep serving the cached copy until the date you set.

Images

On most websites an image will never change; by change I mean the same image URL starts serving a different image. There may be times when you use a different image with a different URL, but that will simply be a new cache entry for the browser.
So if you know an image is never going to change, you can set its expiry far into the future and the browser will always serve it from its own cache.

CSS And JavaScript Files

CSS files can also be cached by the browser. You may even have seen this: you change a CSS file, refresh your browser and the styles don't change; the file clearly has the new styles, so you refresh again and now they come through.
That is your browser caching your CSS files.
You can set an expiry on these files too, but it depends how often your website's CSS is going to change. If you change it often you may only want to set an expiry of a couple of days; if your CSS rarely changes you can set a longer expiry.

Set Expiry Date using htaccess

To set your expiry date using htaccess I like to use the example from HTML5 Boilerplate, as it will take care of everything you will ever want to cache.
# ----------------------------------------------------------------------
# Expires headers (for better cache control)
# ----------------------------------------------------------------------
 
#
# These are pretty far-future expires headers
# They assume you control versioning with cachebusting query params like:
#   <script src="application.js?20100608">
# Additionally, consider that outdated proxies may miscache
#
#   www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/
 
#
# If you don't use filenames to version, lower the css and js to something like "access plus 1 week"
#
 
<IfModule mod_expires.c>
  ExpiresActive on
 
# Perhaps better to whitelist expires rules? Perhaps.
  ExpiresDefault                          "access plus 1 month"
 
# cache.appcache needs re-requests in FF 3.6 (thx Remy ~Introducing HTML5)
  ExpiresByType text/cache-manifest       "access plus 0 seconds"
 
 
 
# Your document html
  ExpiresByType text/html                 "access plus 0 seconds"
   
# Data
  ExpiresByType text/xml                  "access plus 0 seconds"
  ExpiresByType application/xml           "access plus 0 seconds"
  ExpiresByType application/json          "access plus 0 seconds"
 
# RSS feed
  ExpiresByType application/rss+xml       "access plus 1 hour"
 
# Favicon (cannot be renamed)
  ExpiresByType image/x-icon              "access plus 1 week"
 
# Media: images, video, audio
  ExpiresByType image/gif                 "access plus 1 month"
  ExpiresByType image/png                 "access plus 1 month"
  ExpiresByType image/jpg                 "access plus 1 month"
  ExpiresByType image/jpeg                "access plus 1 month"
  ExpiresByType video/ogg                 "access plus 1 month"
  ExpiresByType audio/ogg                 "access plus 1 month"
  ExpiresByType video/mp4                 "access plus 1 month"
  ExpiresByType video/webm                "access plus 1 month"
   
# HTC files  (css3pie)
  ExpiresByType text/x-component          "access plus 1 month"
   
# Webfonts
  ExpiresByType font/truetype             "access plus 1 month"
  ExpiresByType font/opentype             "access plus 1 month"
  ExpiresByType application/x-font-woff   "access plus 1 month"
  ExpiresByType image/svg+xml             "access plus 1 month"
  ExpiresByType application/vnd.ms-fontobject "access plus 1 month"
     
# CSS and JavaScript
  ExpiresByType text/css                  "access plus 1 year"
  ExpiresByType application/javascript    "access plus 1 year"
  ExpiresByType text/javascript           "access plus 1 year"
   
  <IfModule mod_headers.c>
    Header append Cache-Control "public"
  </IfModule>
   
</IfModule>

Copy the block above into your .htaccess file and adjust the expiry times to suit how often each type of asset changes on your site.
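If you keep the far-future expiry times, the Boilerplate comments above assume you version your asset URLs with a cache-busting query string. Here is a minimal PHP sketch of that idea, using a hypothetical versioned_asset() helper based on the file's modification time:
<?php
// Hypothetical helper (illustration only): append the file's modification
// time as a version query string, so the URL changes whenever the file
// changes and the browser fetches a fresh copy despite a far-future expiry.
function versioned_asset($path)
{
    $file = $_SERVER['DOCUMENT_ROOT'] . $path;
    $version = file_exists($file) ? filemtime($file) : time();
    return $path . '?v=' . $version;
}

// Example: output a stylesheet link that is safe to cache for a year.
echo '<link rel="stylesheet" href="' . versioned_asset('/css/style.css') . '">';
?>
As the linked article above notes, versioning the filename itself avoids problems with some proxies, but the query-string approach is the simplest to retrofit.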

File_Exists For Remote URL

If you want to make sure that a file exists in PHP you can use the function file_exists(), which takes one parameter of the filename.
// Returns true if the file exists
file_exists( $filename );
This function also works for directories: you can pass in the path of a directory and, if that directory exists, file_exists() will return true.
<?php
$filename = '/path/to/foo.txt';

if (file_exists($filename)) {
    echo "The file $filename exists";
} else {
    echo "The file $filename does not exist";
}
?>
The problem I've seen with this function is that people try to use it to check whether a remote file exists by passing in a URL. file_exists() does not work with URLs; it will always return false.
If you want to check whether a remote file exists, you need to make an HTTP request for the file and check the HTTP status code in the response headers.

Get Headers Of A URL

To get the headers of a remote file you can use the PHP function get_headers(). It takes the URL you want to request and returns an array of the response headers. The first element of the array is the one we are interested in: the HTTP status line. If the file exists the status will be a 200 code; if the remote file doesn't exist the status will be a 404.
This means we can use get_headers() to check whether the remote file exists.
// Suppress warnings with @ in case the request fails entirely
$file_headers = @get_headers($url);
// Treat a failed request or a 404 status line as "file does not exist"
if ($file_headers === false || strpos($file_headers[0], '404') !== false)
{
   $file_exists = false;
} else {
   $file_exists = true;
}
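If you need more control, another option is a HEAD request via cURL, which checks the status code without downloading the file body. This is just a sketch, assuming the cURL extension is available; remote_file_exists() is a hypothetical helper name:
<?php
// Issue a HEAD request and treat any 2xx/3xx status as "the file exists".
function remote_file_exists($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    return $status >= 200 && $status < 400;
}
?>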

Redirect HTTP to HTTPS

Since Google announced that HTTPS would become part of their ranking signals, you will have seen a vast number of websites switching from HTTP to HTTPS.
If you are switching your site to HTTPS, remember that pages already indexed by Google, and websites linking to you, will still point to the old HTTP URLs. You therefore need to make sure you redirect all non-HTTPS pages to HTTPS.

Redirect With htaccess

If you are using an apache server then you can use the following code snippet and enter it into your htaccess file.
This will catch anything coming in on port 80, the default port for HTTP, and redirect it to HTTPS.
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

Redirect On Nginx

If you are using Nginx, you can add the following to your server conf. This will listen for requests on port 80 and redirect them to the same URL with an HTTPS prefix; all you have to do is replace example.com with your own domain.
server {
    listen      80;
    server_name example.com;
    return 301   https://$server_name$request_uri;
}