Friday, December 26, 2014

How to take Partial Backup Of Database in SQL Server

Hello friends!!! Hope you are doing great. In this small blog post I'm going to explain how to take a partial backup of a database in SQL Server.

What Is Partial Backup?

As Per MSDN
"Partial backups are useful whenever you want to exclude read-only filegroups. A partial backup resembles a full database backup, but a partial backup does not contain all the filegroups. Instead, for a read-write database, a partial backup contains the data in the primary filegroup, every read-write filegroup, and, optionally, one or more read-only files. A partial backup of a read-only database contains only the primary filegroup."

This means that partial backups are only really relevant for databases that contain read-only filegroups. If a database has none, a partial backup will capture exactly the same data and objects as an equivalent full database backup.

Partial backups are available in SQL Server 2005 and later versions. They were basically designed for large databases, where excluding read-only data reduces backup and restore time.

How TO Take Partial Backup:

So let's see how to take a partial backup of a SQL Server database. Please use the script below to set up the environment. You cannot take a partial backup using the GUI, so you have to use a T-SQL script.

USE [master]
GO

CREATE DATABASE [PartialBackup] ON PRIMARY 
(   NAME = N'PartialBackup'
  , FILENAME = N'C:\Data\PartialBackups.mdf' 
  , SIZE = 10240KB  ), FILEGROUP [Read_only_Data] 
(   NAME = N'PartialBackup_ReadOnly'
  , FILENAME = N'C:\Data\PartialBackup_Read_Only.ndf' 
  , SIZE = 10240KB  ) LOG ON 
(   NAME = N'PartialBackup_log'
  , FILENAME = N'C:\Data\PartialBackups_log.ldf' 
  , SIZE = 10240KB )
GO

ALTER DATABASE [PartialBackup] SET RECOVERY SIMPLE
GO


Let's create some tables and insert data; then we will mark the filegroup "Read_only_Data" as read-only.

USE [PartialBackup]
GO

CREATE TABLE dbo.RegistrationTable
    (
      ID INT IDENTITY  ,
      Name varchar(20) NOT NULL
    )
ON  [PRIMARY]
GO

CREATE TABLE dbo.Read_only_table
    (
      ID INT IDENTITY  ,
     Name varchar(20) NOT NULL
    )
ON  [Read_only_Data]
GO
INSERT  INTO dbo.RegistrationTable
VALUES  ('ABC'),('DEF'),('ETC')
INSERT  INTO dbo.Read_only_table
VALUES   ('XXX'),('YYY'),('ZZZ')

GO



Now let's change the filegroup "Read_only_Data" to read-only:

ALTER DATABASE [PartialBackup] MODIFY FILEGROUP [Read_only_Data] READONLY
GO

Now, before we take our first partial backup, we will take one backup of the whole database, including the read-only data, as the basis for any subsequent restore operations.
By the way, we can take a partial backup before ever taking a full database backup. But if you don't have a single full backup, there is nothing from which to restore the read-only files in case of failure. So it's a good habit to take a full backup before partial backups.

USE [master]
GO
BACKUP DATABASE PartialBackup
TO DISK = N'C:\Data\PartialBackup_FULL.bak'
GO


We can see from the output that it processes both of our data files (PRIMARY and Read_only_Data), plus the log file. Now let's insert a few more rows into the read-write table:

INSERT  INTO  PartialBackup.dbo.RegistrationTable
VALUES  ('WWW'),('RRRR'),('TTTTTT')
GO

Now take a partial backup of the database:

USE [PartialBackup]
GO
BACKUP DATABASE PartialBackup READ_WRITE_FILEGROUPS
TO DISK = N'C:\Data\PartialBackup_PARTIAL_KA_Full.bak'
GO


We just added READ_WRITE_FILEGROUPS to the script to take a partial backup, since we want to exclude read-only filegroups from our backup.

We can see from the output that only the primary data file and the log file are processed. So we are good so far, as we only wanted to exclude the read-only filegroup.

Now let's see how to take a differential partial backup:

Just as we can take differential database backups, which are based on a full database backup known as the base, we can also take differential partial backups. These refer to a base partial backup and capture only the data that has changed in the read-write data files since that base partial backup was taken.

Before taking the partial differential backup, insert a few rows so that they are included in the backup:

USE [PartialBackup]
GO
INSERT  INTO RegistrationTable
VALUES  ('FFFF'),('RTYU')
GO

We can use the script below to take a partial differential backup of the database.

USE [master]
GO
BACKUP DATABASE [PartialBackup] READ_WRITE_FILEGROUPS
TO DISK = N'C:\Data\PartialBackup_PARTIAL_KA_Diff.bak'
WITH DIFFERENTIAL
GO

So this is how we can take a partial backup of a database in SQL Server. In the next blog I'm going to explain how to restore partial backups; it is almost the same as the normal restore process. Till then, enjoy. Wish you a very happy weekend!!!

Thanks For Reading This Blog!!!

Tuesday, December 23, 2014

A small Twist in Full Recovery Model Of SQL Server...

Hi Friends!! In this small blog post I'm going to tell you a very interesting thing about the Full recovery model. I know most people are aware of it, but in case you aren't, I'm sure you will hardly believe me at first. :) :)

As we all know, there are three recovery models in SQL Server, which we choose according to business requirements:

Simple Recovery Model
Bulk-logged Recovery Model
Full Recovery Model

For more information about recovery models in SQL Server, you can visit the page below:
http://msdn.microsoft.com/en-IN/library/ms189275.aspx

In this blog post I'm going to tell you a secret about the Full recovery model.

So, let's assume that you have a database "TestVK" in Simple recovery mode which has been running for the last month. This morning when you reached the office, your manager came to you and asked you to change the recovery model from Simple to Full. You agreed, and from the database options you changed it to the Full recovery model. You confirmed to your manager that the job was done.

Now my question is: will the database "TestVK" work under the Full recovery model from now on?

If your answer is "NO" then close this tab and ENJOY!!! :)

If your answer is "YESSSS.. it will work as the Full recovery model" then please wait, because I think this blog is for you :)

I know you will say this blogger is crazy. After all, you just confirmed it from sys.databases, and it is showing the recovery model as "FULL", so how can I say it is not working as the Full recovery model?

select name,recovery_model_desc from sys.databases where name='testvk'

But let me tell you the fact: the Full recovery model does not work as "FULL" unless you take a full backup of your database. We use the Full recovery model to prevent truncation of the transaction log; the log can only be truncated by a transaction log backup of that database, except in a few cases. And during a PIT (point-in-time) restore you need a base, which is your full backup. If you don't have any full backup, the transaction log is of no use for restores. That is why, while no full backup exists, SQL Server works as if the database were in the Simple recovery model: it automatically reclaims log space to keep space requirements small, essentially eliminating the need to manage transaction log space.
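You can verify which state your database is really in through sys.databases: the log_reuse_wait_desc column tells you why the log currently cannot be truncated. A quick sketch (assuming the TestVK database used below):

```sql
-- Before the first full backup, log_reuse_wait_desc typically shows NOTHING
-- (the log is being auto-truncated, Simple-style). After a full backup it
-- shows LOG_BACKUP, meaning only a log backup will free the space.
SELECT name,
       recovery_model_desc,
       log_reuse_wait_desc
FROM   sys.databases
WHERE  name = 'TestVK';
```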

Let me show you the same in SSMS. Let's create a database "TestVK". It was initially in Simple, so I have changed it to the Full recovery model. Go to options and disable auto log growth. Now create a table and insert data into it to generate log records; since the database is in the Full recovery model, the log should not be truncated. Am I right?? Okay, then let's see what happens here.


Use below script to create environment:

USE master
GO
CREATE DATABASE TestVK
GO
ALTER DATABASE TestVK SET RECOVERY FULL
GO
USE TestVK
GO
CREATE TABLE Test1 (Id int IDENTITY(1,1), name char(500))
GO



Let's verify the recovery model of the database using sys.databases:

select name,recovery_model_desc from sys.databases where name='TestVK'

Output:
name recovery_model_desc
TestVK FULL



Now disable auto log growth from the database file properties.

Check the log usage of this database using DBCC SQLPERF(LOGSPACE). If you want more information about SQLPERF(LOGSPACE), please visit my old blog post on the same:

How to see the log usage information of databases??

Now Execute: DBCC SQLPERF(LOGSPACE)

Output:

Database Name   Log Size (MB)   Log Space Used (%)   Status
TESTVK          0.5546875       42.78169             0

Now let's insert data into the Test1 table and monitor the log space usage:

insert into Test1 values ('vimal')
go 200

Execute
DBCC SQLPERF(LOGSPACE)

Output:

Database Name   Log Size (MB)   Log Space Used (%)   Status
TESTVK          0.5546875       80.72183             0

Again Insert:

insert into Test1 values ('vimal')
go 200

Execute
DBCC SQLPERF(LOGSPACE)

Output:

Database Name   Log Size (MB)   Log Space Used (%)   Status
TESTVK          0.5546875       41.10915 (reduced)   0


From the log usage output above, we can see that the log is being reused within the same small file. It means that although the database is in the Full recovery model, internally it is working as if it were in the Simple recovery model.

Now let's take a full backup of the database, and then we will monitor the log space usage:

BACKUP DATABASE [TestVK] TO  DISK = N'D:\TestVK_backup_2014_12_23_220007_1630000.bak' 
WITH stats=10
GO

OUTPUT
12 percent processed.
21 percent processed.
31 percent processed.
40 percent processed.
52 percent processed.
62 percent processed.
71 percent processed.
80 percent processed.
90 percent processed.
Processed 256 pages for database 'TESTVK', file 'TESTVK' on file 2.
100 percent processed.
Processed 1 pages for database 'TESTVK', file 'TESTVK_log' on file 2.

BACKUP DATABASE successfully processed 257 pages in 0.216 seconds (9.295 MB/sec).

Now insert
insert into Test1 values ('vimal')
go 200

Execute
DBCC SQLPERF(LOGSPACE)

Output:

Database Name   Log Size (MB)   Log Space Used (%)   Status
TESTVK          0.5546875       78.34507             0

Again insert data into the table:
insert into Test1 values ('vimal')
go 2000

Error 
** An error was encountered during execution of batch. Continuing.
Msg 9002, Level 17, State 2, Line 1
The transaction log for database 'TestVK' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Execute
DBCC SQLPERF(LOGSPACE)

Output:

Database Name   Log Size (MB)   Log Space Used (%)   Status
TESTVK          0.5546875       100                  0




Ooooops, log FULL :) :)

So now it's clear that the database is finally working as a true Full recovery model: after the full backup, the log is no longer truncated automatically. I hope you got the point I wanted to explain here.

So, next time you change the recovery model from Simple to Full, please do not forget to take a full backup.



Thanks For reading this Blog!!!




Monday, December 22, 2014

Difference Between SET and SELECT Statement In SQL Server..

There are plenty of blogs where you will find the differences between SET and SELECT, but only a few cover a subtle and very important difference that should be kept in mind while writing code.

Basically, SET is the ANSI SQL standard for setting variables; SELECT is not.

SET works only for single assignments; SELECT can do multiple assignments.
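For example (a minimal sketch with throwaway variable names):

```sql
DECLARE @a int, @b int;

-- SELECT can assign several variables in one statement:
SELECT @a = 1, @b = 2;

-- SET assigns exactly one variable per statement:
SET @a = 10;
SET @b = 20;

-- SET @a = 1, @b = 2;  -- syntax error: SET cannot do multiple assignments
```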

The important point on which I'm going to focus is:

"A SELECT that does not fetch anything also does not set anything, but that is not the case with SET."

Let's see the same in SSMS.

Use Master
Go
Create database  TestVK
Go
Use TestVK
Go
Create table Test1 (id int)
Go
Insert into Test1 values(1),(3)
Go
Select * from Test1

Now run the queries below and see the difference in output:

Declare @id int =-1
select @id = id from Test1 where id =7;
select @id

Output: -1

This is misleading: the variable keeps its old value. From this it's clear that a SELECT that does not select or fetch anything also does not set anything, but that is not the case with SET.

Execute the query below and check the result:

Declare @id int =-1
Set @id = (select id from Test1 where id =7) 
select @id

Output: NULL

I hope it is now clear from the example above. Please keep this difference in mind while writing code.


Thanks For Reading This Blog!!!

Error "Msg 515, Level 16, State 2, Line 2. Cannot insert the value NULL into column 'Age', table 'tempdb.dbo.#t1_'; column does not allow nulls. INSERT fails."

In this small blog post, I have something very interesting about temporary tables. So let me tell you how I got the error in the title. Last week I created a simple stored procedure using a temporary table on my office system and executed it. It was running fine. But when I ran it on my laptop it gave the error below:

"Msg 515, Level 16, State 2, Line 2. Cannot insert the value NULL into column 'age', table 'tempdb.dbo.#T1_'; column does not allow nulls. INSERT fails."

I was surprised, as it was working fine in the office. The next day when I reached the office I ran the same stored procedure on my office system, and again it worked fine. Now I had a clear idea that it was happening because of some connection setting. Without wasting a second I opened my laptop, connected to SSMS, and went straight to Tools > Options > Query Execution > SQL Server > ANSI.

Please do not assume that I'm so intelligent that I found the issue that quickly. What really happened is that two days earlier I had been playing with the query execution option SET ANSI_NULL_DFLT_ON; I unchecked it and forgot to check it again. That is why I had a clear idea of where the problem was.

As per MSDN:
When SET ANSI_NULL_DFLT_ON is ON, new columns created by using the ALTER TABLE and CREATE TABLE statements allow null values if the nullability status of the column is not explicitly specified. SET ANSI_NULL_DFLT_ON does not affect columns created with an explicit NULL or NOT NULL.

As my office system had it set to ON, the NULL value was inserted into the temporary table. But on my laptop it was set to OFF, so when the temp table was created the column was NOT NULL by default, and a NULL value could not be inserted into it.
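The same behavior can be reproduced without touching the SSMS options, because that checkbox simply controls the session setting SET ANSI_NULL_DFLT_ON. A sketch (table names are just illustrative):

```sql
-- With the option OFF, a column declared without NULL/NOT NULL
-- defaults to NOT NULL in this session.
SET ANSI_NULL_DFLT_ON OFF;
CREATE TABLE #Demo_Off (age int);   -- age is NOT NULL here

-- With the option ON, the same declaration allows NULLs.
SET ANSI_NULL_DFLT_ON ON;
CREATE TABLE #Demo_On (age int);    -- age is NULL here
```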

Let's create a sample scenario to demonstrate. I will write some sample T-SQL code to explain it.
Use master
GO
Create database TestVK
Go
Use TestVK
Go
Create Table #T1(id int, name char(5) , age int);
Go
Insert into #T1 values(1,'Arman',null)
Go
select * from #T1

GO
It will be inserted successfully.

Let's check the nullability status of its columns:
SELECT V.TABLE_NAME, V.COLUMN_NAME, V.IS_NULLABLE
FROM tempdb.INFORMATION_SCHEMA.COLUMNS AS V
WHERE V.TABLE_NAME LIKE '#T1%';
GO



From the output above it is clear that all the columns are nullable.

Now close your query window, go to Tools > Options > Query Execution > SQL Server > ANSI, and uncheck ANSI_NULL_DFLT_ON.


Now open a new query window, execute the query below, and see the output:

Use TestVK;
Create Table #T1(id int, name char(5) , age int);
Go
Insert into #T1 values(1,'Zoya',null)
Go

OUTPUT:

Msg 515, Level 16, State 2, Line 1
Cannot insert the value NULL into column 'age', table 
'tempdb.dbo.#T1_____________________000000000383'; column does not allow nulls. INSERT fails.
The statement has been terminated.



It's because ANSI_NULL_DFLT_ON is OFF, so the column in this table was created as NOT NULL. You can see the status of the columns with:

SELECT V.TABLE_NAME, V.COLUMN_NAME, V.IS_NULLABLE
FROM tempdb.INFORMATION_SCHEMA.COLUMNS AS V
WHERE V.TABLE_NAME LIKE '#T1%';
GO
Output:



So next time you use a temporary table in your procedure, take care of this before it gets you into big trouble. The precaution you can take: always define the nullability explicitly when defining tables, temporary or otherwise.
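For instance, a version of the temp table from above with explicit nullability behaves identically no matter how ANSI_NULL_DFLT_ON is set:

```sql
-- Every column states NULL or NOT NULL explicitly,
-- so the session setting no longer matters.
CREATE TABLE #T1
(
    id   int     NOT NULL,
    name char(5) NOT NULL,
    age  int     NULL
);
```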

 Thanks For Reading this Blog!!!




Friday, December 19, 2014

How To Offline Secondary FileGroup in Sql Server???

In this small blog post I'm going to explain how to take a secondary filegroup offline in SQL Server. There may be a situation where a drive gets corrupted and the secondary filegroup becomes unavailable. In that case you can do a filegroup restore of the database: your primary filegroup stays available to users while your secondary filegroup is offline until the restore completes. I'll explain how to do a filegroup restore of your database in a future post. Before applying a filegroup restore, you have to take the damaged secondary file offline. So let's see how to take secondary filegroups offline in SQL Server:

I have a sample database named FileGroup. Use the script below to create it:

CREATE DATABASE FILEGROUP
ON PRIMARY
(NAME = N'REGISTRATION',
FILENAME = N'C:\REGISTRATION.MDF'),
FILEGROUP REPORT
(NAME = N'REPORTFILES',
FILENAME = N'E:\REPORTFILES.NDF')
LOG ON
(NAME = N'FILEGROUPDB_LOG',
FILENAME = N'C:\FILEGROUPDB_LOG.LDF')
  
Let's see the state of the files in our FileGroup database. Execute the query below to get it:

Use FileGroup
Go
select file_id, name, state_desc, physical_name from sys.database_files

You will see output like the following:

file_id  name              state_desc  physical_name
1        Registration      ONLINE      C:\Registration.mdf
2        FileGroupDB_log   ONLINE      C:\FileGroupDB_log.ldf
3        ReportFiles       ONLINE      E:\ReportFiles.ndf


As you can see, right now all our files are online. Now suppose the E drive gets corrupted and the ReportFiles file becomes unavailable.

So now, before restoring, you have to take the file of the secondary filegroup offline, i.e. ReportFiles.

We can take this file offline by executing the query below:

ALTER DATABASE FILEGROUP MODIFY FILE(NAME = REPORTFILES, OFFLINE)

where:
FILEGROUP = the database name
REPORTFILES = the logical file name


If you see the message "Nonqualified transactions are being rolled back. Estimated rollback completion: 100%." in the output, it indicates that there were users connected to the database when you issued the command; those sessions were killed and their transactions rolled back.

Now let's see the state of the files again by executing the query below:

Use FileGroup
Go
select file_id, name, state_desc, physical_name from sys.database_files


As you can see from the output, ReportFiles is now in the OFFLINE state.


Thanks For Reading This Blog!!!!



  

Thursday, December 18, 2014

Error "916".. Server Principal "Login_name" is not able to access the database "database_name" under the current security context...

I got a call from a colleague on another project saying that he was able to connect to the server but was not able to expand the Databases node. While trying to expand it, an error "server principal "Login_name" is not able to access the database "database_name" under the current security context" kept coming up, with error number 916.

I asked him to send a screenshot. Until I received it, I thought he was getting the error while clicking on a particular database. So the first thing that came to my mind was that maybe he didn't have permission to access that database. I had admin rights to his server, so I connected to SQL Server with the SA account and found that he already had the required access rights on that database.

Checking his mail and looking at the screenshot, I found that he was not able to expand the Databases node itself, not a particular database.



I had already worked on the same issue last year, so I had a clear idea about it, but it was never documented in a blog post, so I thought I would do so now.

Here are few reasons because of that you can get SQL Server Error 916:

1: A login does not have permission to view data of a column selected in the header
2: A database is offline and/or its collation is NULL
3: Multiple databases with different collations are on the instance and SSMS is unable to retrieve the collation because a database is configured to auto-close

In my case the reason was 3rd point "unable to retrieve the collation because a database is configured to auto-close".

If you want more information about "Auto_close" option Please click on below link.

Database Auto options in sql server

In my case they were using SQL Server Express edition, and the Auto_Close option is set to True by default in Express edition. To solve this problem, one thing you can do is change the Auto_Close option to False. If not, then follow the steps below:

1: Open SSMS and connect.
2: Open the Object Explorer Details window by clicking "View" in the menu bar and selecting "Object Explorer Details" (or just press F7)
3: In the Object Explorer window, click the Databases folder
4: In the Object Explorer Details window, right-click the column header and uncheck "Collation"
5: Refresh the Databases folder in Object Explorer.
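Alternatively, instead of hiding the Collation column, you can turn off AUTO_CLOSE itself with T-SQL (the database name here is just a placeholder):

```sql
-- Keeping the database open lets SSMS read its collation normally.
ALTER DATABASE [YourDatabase] SET AUTO_CLOSE OFF;
```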




I performed the above steps on the server and the problem was solved.


Thanks For Reading this Blog!!!





Database Auto Options In SQL Server...

What do these auto options mean for a SQL Server database?

AUTO_CLOSE { ON | OFF }
AUTO_CREATE_STATISTICS { ON | OFF }
AUTO_SHRINK { ON | OFF }
AUTO_UPDATE_STATISTICS { ON | OFF }
AUTO_UPDATE_STATISTICS_ASYNC { ON | OFF }



The auto options affect actions that SQL Server might take automatically. All these options are Boolean options, with a value of True or False.

AUTO_CLOSE (Default False): When this option is set to True, the database is closed and shut down cleanly when the last user of the database exits, thereby freeing any resources. All file handles are closed, and all in-memory structures are removed so that the database isn't using any memory. When a user tries to use the database again, it reopens. If the database was shut down cleanly, it isn't initialized (reopened) until a user tries to use it the next time SQL Server is restarted.

By default it is set to False in Enterprise edition, but in SQL Server Express edition it is set to True.

AUTO_SHRINK (Default False): When this option is set to True, all of a database's files are candidates for periodic shrinking. Both data files and log files can be automatically shrunk by SQL Server. The only way to free space in the log files so that they can be shrunk is to back up the transaction log or set the recovery model to SIMPLE. The log files shrink at the point that the log is backed up or truncated. This option is never recommended.

AUTO_CREATE_STATISTICS (Default True): When this option is set to True (the default), the SQL Server Query Optimizer creates statistics on columns referenced in a query's WHERE, ON, GROUP BY, or DISTINCT clauses. Adding statistics improves query performance because the SQL Server Query Optimizer can better determine how to evaluate a query.

AUTO_UPDATE_STATISTICS (Default True): When this option is set to True (the default), existing statistics are updated if the data in the tables has changed. SQL Server keeps a counter of the modifications made to a table (colmodctr) and uses it to determine when statistics are outdated. (You cannot see the metadata of this counter without using the DAC.) When this option is set to False, existing statistics aren't automatically updated, though they can be updated manually.

Statistics are checked before query compilation or before executing a cached query plan. Statistics are considered out-of-date when:
There was a data change on an empty table.
The number of rows in the table was 500 or less at the time of statistics creation and the column modification counter of the leading column of the statistics object has changed by more than 500 since then.
The table had more than 500 rows when the statistics were gathered, and the column modification counter of the leading column of the statistics object has changed by more than 500 + 20% of the number of rows in the table when the statistics were gathered.
A table in TempDB with less than 6 rows had at least 6 row modifications. 

AUTO_UPDATE_STATISTICS_ASYNC (Default False): Introduced in SQL Server 2005, this option allows the statistics update operation to be performed on a background thread in a different transaction context. When it is enabled, the Query Optimizer does not wait for the update of statistics: the query runs first, with the current statistics, while a background process starts updating the outdated statistics in a separate thread. Note that the Query Optimizer may choose a sub-optimal query plan if statistics are outdated when the query compiles. When the background operation completes, new query requests use the updated statistics.
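All of these options can be checked in one place via sys.databases; a quick sketch:

```sql
-- Current auto-option settings for every database on the instance.
SELECT name,
       is_auto_close_on,
       is_auto_shrink_on,
       is_auto_create_stats_on,
       is_auto_update_stats_on,
       is_auto_update_stats_async_on
FROM   sys.databases;
```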


Thanks For Reading this Blog!!

References: SQL Server Internals