Wednesday, August 27, 2008

When did you last change the sa password?

Changing passwords regularly is an important part of a robust security implementation, and it becomes even more important for the sa login. To find out when you last applied this good practice, use the following script.

USE Master
GO

SELECT sid, [name], createdate, updatedate
FROM master.dbo.syslogins
WHERE [name] = 'sa'
GO


For SQL Server 2005 specifically, use:

USE Master
GO

SELECT [name], sid, create_date, modify_date
FROM sys.sql_logins
WHERE [name] = 'sa'
GO



The script returns the name, sid, creation date and update/modification date of the login. Since practically nothing on the sa account changes except its password, the modification date here is effectively the date the sa password was last changed. If no name condition is supplied, the modification date shows the last update for every login.

USE Master
GO

SELECT [name], sid, create_date, modify_date
FROM sys.sql_logins
GO
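Building on this, the modify_date can be turned directly into an age. A small sketch (SQL Server 2005; the column alias is my own):

```sql
-- How many days since the sa password (modify_date) last changed
SELECT [name],
       modify_date,
       DATEDIFF(day, modify_date, GETDATE()) AS days_since_change
FROM sys.sql_logins
WHERE [name] = 'sa'
GO
```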

Saturday, August 23, 2008

Copy or move table from one Database to another

Suppose we have a table Tab1 in database DB1, and we need to move Tab1 to another database DB2. It is assumed that DB2 already exists (with or without tables in it) and that Tab1 does not yet exist in DB2, because SELECT ... INTO creates the destination table. First, use the following T-SQL command to copy the table, with all its data, into the destination database.

SELECT *
INTO DB2..Tab1
FROM DB1..Tab1
GO


If you only need to copy the table, you are done. If you need to move it, use the following command to drop the table from the source database.

DROP TABLE DB1..Tab1
GO


Because this is done with a SELECT statement, no associated objects such as triggers, indexes, constraints or views are copied.
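As a side note, the same SELECT ... INTO pattern can copy just the table structure without any rows, by adding a condition that is never true. A sketch, using the same DB1/DB2/Tab1 names as above:

```sql
-- Copy only the structure of Tab1: the WHERE clause matches no rows
SELECT *
INTO DB2..Tab1
FROM DB1..Tab1
WHERE 1 = 0
GO
```

Even here, only the column definitions (and the identity property) come across; constraints and indexes still have to be scripted separately.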

Thursday, August 21, 2008

Data Type Precedence (T-SQL)

When two expressions of different data types are combined by an operator, the data type precedence rules specify which data type is converted to the other. The data type with the lower precedence is converted to the data type with the higher precedence. If the conversion is not a supported implicit conversion, an error is returned. When both operand expressions have the same data type, the result of the operation has that data type.

This is the precedence order for the Microsoft® SQL Server™ data types:

* datetime (highest)
* smalldatetime
* float
* real
* decimal
* money
* smallmoney
* int
* smallint
* tinyint
* bit
* ntext
* text
* image
* timestamp
* nvarchar
* nchar
* varchar
* char
* varbinary
* binary
* uniqueidentifier (lowest)

Data Type Precedence and Comparison Operators

These data type precedence rules are modified by the comparison operators. Comparison operators always return a Boolean data type. When a comparison operator is applied to expressions of two different data types, one of the expressions must be converted to the data type of the other before the comparison can be made. The data type precedence rules control this internal conversion, with the following exceptions:

* If one of the expressions is an aggregate function that is not in a subquery, and the other expression is not an aggregate, then the data type of the aggregate function is used regardless of the precedence rules.

* If one of the expressions is a column, and the other is not a column or aggregate function, the data type of the column is used regardless of the precedence rules.
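A couple of quick queries illustrate the precedence rules above (results assume default settings):

```sql
-- varchar is lower than int, so '2' is implicitly converted to int
SELECT 1 + '2'                            -- returns 3, not '12'

-- int is lower than datetime, so the int is converted and added as days
SELECT CAST('20080821' AS datetime) + 1   -- 2008-08-22

-- Not every implicit conversion succeeds; this one raises a conversion error
-- SELECT 1 + 'abc'
GO
```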

Wednesday, August 20, 2008

List permissions on SQL Server 2000 objects through sp_helprotect

I was required to list all permissions on a database or table. Although this can be done through EM/SSMS, analyzing permissions is more efficient through T-SQL than by any other means. I have found a very useful system stored procedure, sp_helprotect, for this purpose. It lists user permissions for an object.
The general syntax for sp_helprotect is

sp_helprotect [ [ @name = ] 'object_statement' ]
    [ , [ @username = ] 'security_account' ]
    [ , [ @grantorname = ] 'grantor' ]
    [ , [ @permissionarea = ] 'type' ]


You may use it in the following ways:

--List all user permissions of all Database objects
sp_helprotect
GO

-- List all user permissions of a table
sp_helprotect 'tableName'
GO

-- List all user permissions of stored procedure
EXEC sp_helprotect 'spName'
GO

-- List all user permissions of sp granted by dbo
sp_helprotect 'spName', NULL, 'dbo'
GO

-- List all Object type user permissions
sp_helprotect NULL, NULL,NULL,'o'
GO

-- List all statement type user permissions
sp_helprotect NULL, NULL,NULL,'s'
GO


-- List all permissions for a user
sp_helprotect NULL,'user'
GO


Note: sp_helprotect is essentially for SQL Server 2000. In SQL Server 2005 and later, sp_helprotect has no information about securables that were introduced in SQL Server 2005; to list permissions on SQL Server 2005 objects, use sys.database_permissions and fn_builtin_permissions instead.
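For SQL Server 2005, a rough equivalent (a sketch only, not a full replacement for sp_helprotect) joins sys.database_permissions to sys.database_principals:

```sql
-- List grantees, permissions and objects in the current database (SQL Server 2005)
SELECT pr.[name]                AS grantee,
       pe.permission_name,
       pe.state_desc,           -- GRANT / DENY / GRANT_WITH_GRANT_OPTION
       OBJECT_NAME(pe.major_id) AS object_name
FROM sys.database_permissions pe
JOIN sys.database_principals pr
  ON pe.grantee_principal_id = pr.principal_id
WHERE pe.class_desc = 'OBJECT_OR_COLUMN'
GO
```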

Friday, August 15, 2008

Statistics update with Reindex

Question: Does SQL Server update statistics with Reindex on a table in SQL Server 2005?


Answer: Yes, with the equivalent of a full table scan


Explanation: Whenever an index is rebuilt, the statistics on the table are updated with the equivalent of a full table scan.

Ref: Search Engine Q&A #10: Rebuilding Indexes and Updating Statistics
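To see the effect yourself, rebuild an index and check the statistics date (the table name here is a placeholder):

```sql
-- Rebuild all indexes on a table; this also updates its statistics with a full scan
ALTER INDEX ALL ON dbo.MyTable REBUILD
GO

-- Verify when each statistic on the table was last updated
SELECT [name] AS stats_name,
       STATS_DATE(OBJECT_ID('dbo.MyTable'), stats_id) AS last_updated
FROM sys.stats
WHERE [object_id] = OBJECT_ID('dbo.MyTable')
GO
```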


Note: The Question of the Day is taken from SQLServerCentral.com. I want to compile selected QODs.

Tuesday, August 12, 2008

DBA Best Practices for Queries and Stored Procedures

Queries and stored procedures are close companions of a DBA, so make your life easier by following the baseline below while writing queries and stored procedures.
  1. Maintain all code in a source control system.
  2. Keep transactions as short as possible. This reduces locking and increases application concurrency, which helps to boost performance.
  3. Avoid using query hints unless you know exactly what you are doing, and you have verified that the hint actually boosts performance.
  4. Encapsulate all transactions within stored procedures, including both the BEGIN TRANSACTION and COMMIT TRANSACTION statements in the procedure.
  5. Use the least restrictive transaction isolation level possible for your user connection, instead of always using the default READ COMMITTED.
  6. SET NOCOUNT ON at the beginning of each stored procedure you write. This statement should be included in every stored procedure you write.
  7. Before you are done with your stored procedure code, review it for any unused code, parameters, or variables that you may have forgotten to remove while you were making changes, and remove them.
  8. For best performance, all objects that are called within the same stored procedure should all be owned by the same object owner or schema, preferably dbo, and should also be referred to in the format of object_owner.object_name or schema_owner.object_name.
  9. One way to help ensure that stored procedures query plans are reused from execution to execution of the same stored procedure is to ensure that any user connections information, SET options, database options, or SQL Server configuration options don’t change from execution to execution of the same stored procedure. If they do change, then SQL Server may consider these same stored procedures to be different, and not be able to reuse the current query plan stored in cache.
  10. Whenever a client application needs to send Transact-SQL to SQL Server, send it in the form of a stored procedure instead of a script or embedded Transact-SQL. Stored procedures offer many benefits, including:
     * Reduced network traffic and latency, boosting application performance.
     * Stored procedure execution plans can be reused, staying cached in SQL Server’s memory, reducing server overhead.
     * Client execution requests are more efficient. For example, if an application needs to INSERT a large binary value into an image data column not using a stored procedure, it must convert the binary value to a character string (which doubles its size), and send it to SQL Server. When SQL Server receives it, it then must convert the character value back to the binary format. This is a lot of wasted overhead. A stored procedure eliminates this issue as parameter values stay in the binary format all the way from the application to SQL Server, reducing overhead and boosting performance.
     * Stored procedures help promote code reuse. While this does not directly boost an application’s performance, it can boost the productivity of developers by reducing the amount of code required, along with reducing debugging time.
     * Stored procedures can encapsulate logic. You can change stored procedure code without affecting clients (assuming you keep the parameters the same and don’t remove any result set columns). This saves developer time.
     * Stored procedures provide better security to your data.
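Several of the guidelines above (SET NOCOUNT ON, encapsulating the transaction in the procedure, keeping it short) can be seen together in one skeleton. Everything here — procedure, table and column names — is hypothetical:

```sql
CREATE PROCEDURE dbo.usp_TransferFunds
    @FromAccount int,
    @ToAccount   int,
    @Amount      money
AS
BEGIN
    SET NOCOUNT ON;             -- guideline 6: suppress "rows affected" messages

    BEGIN TRY
        BEGIN TRANSACTION;      -- guideline 4: BEGIN and COMMIT live in the procedure

        UPDATE dbo.Accounts SET Balance = Balance - @Amount
        WHERE AccountId = @FromAccount;

        UPDATE dbo.Accounts SET Balance = Balance + @Amount
        WHERE AccountId = @ToAccount;

        COMMIT TRANSACTION;     -- guideline 2: keep the transaction short
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
    END CATCH
END
GO
```

TRY/CATCH requires SQL Server 2005 or later; on SQL Server 2000 you would check @@ERROR after each statement instead.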
A part chosen from SQL Server DBA Best Practices by Brad M. McGehee

Monday, August 11, 2008

DBA Best Practices for Database Design

Good database design is a luxury that pays off for a lifetime. Follow these basic guidelines to produce a good database design.
  1. Bad logical database design results in bad physical database design, and generally results in poor database performance. So, if it is your responsibility to design a database from scratch, be sure you take the necessary time and effort to get the logical database design right. Once the logical design is right, then you also need to take the time to get the physical design right.
  2. Normalize your data to ensure best performance.
  3. Take advantage of SQL Server’s built-in referential integrity. You don’t need to write your own.
  4. Always specify the narrowest columns you can. In addition, always choose the smallest data type you need to hold the data you need to store in a column. The narrower the column, the less amount of data SQL Server has to store, and the faster SQL Server is able to read and write data.
  5. Try to avoid performing both OLTP and OLAP transactions within the same database.

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Indexing

Indexing is like the nervous system of database optimization: depending on how you configure it, it has a terrific impact on performance. Follow these useful guidelines to optimize the performance of a SQL Server database.
  1. Periodically, run the Index Wizard or Database Engine Tuning Advisor against current Profiler traces to identify potentially missing indexes.
  2. Remove indexes that are never used.
  3. Don’t accidentally create redundant indexes.
  4. As a rule of thumb, every table should have at least a clustered index. Generally, but not always, the clustered index should be on a column that monotonically increases — such as an identity column, or some other column where the value is increasing — and is unique. In many cases, the primary key is the ideal column for a clustered index.
  5. Since you can only create one clustered index per table, take extra time to carefully consider how it will be used. Consider the type of queries that will be used against the table, and make an educated guess as to which query (the most common one run against the table, perhaps) is the most critical, and if this query will benefit from having a clustered index.
  6. If a column in a table is not at least 95% unique, then most likely the query optimizer will not use a non-clustered index based on that column. Because of this, you generally don’t want to add non-clustered indexes to columns that aren’t at least 95% unique.
  7. Keep the “width” of your indexes as narrow as possible. This reduces the size of the index and reduces the number of disk I/O reads required to read the index, boosting performance.
  8. If possible, avoid adding a clustered index to a GUID column (uniqueidentifier data type). GUIDs take up 16 bytes of storage, more than an identity column, which makes the index larger, which increases I/O reads, which can hurt performance.
  9. Indexes should be considered on all columns that are frequently accessed by the JOIN, WHERE, ORDER BY, GROUP BY, TOP, and DISTINCT clauses.
  10. Don’t automatically add indexes on a table because it seems like the right thing to do. Only add indexes if you know that they will be used by the queries run against the table.
  11. When creating indexes, try to make them unique indexes if at all possible. SQL Server can often search through a unique index faster than a non-unique index because in a unique index, each row is unique, and once the needed record is found, SQL Server doesn’t have to look any further.
  12. If you perform regular joins between two or more tables in your queries, performance will be optimized if each of the joined columns have appropriate indexes.
  13. Don’t automatically accept the default value of 100 for the fill factor for your indexes. It may or may not best meet your needs. A high fill factor is good for seldom changed data, but highly modified data needs a lower fill factor to reduce page splitting.
  14. Don’t over index your OLTP tables, as every index you add increases the time it takes to perform INSERTS, UPDATES, and DELETES. There is a fine line between having the ideal number of indexes (for SELECTs) and the ideal number to minimize the overhead that occurs with indexes during data modifications.
  15. If you know that your application will be performing the same query over and over on the same table, consider creating a non-clustered covering index on the table. A covering index, which is a form of a composite index, includes all of the columns referenced in SELECT,JOIN, and WHERE clauses of a query. Because of this, the index contains the data you are looking for and SQL Server doesn’t have to look up the actual data in the table, reducing logical and/or physical I/O, and boosting performance.
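Guideline 15 (a covering index) might look like this in practice; the Orders table and its columns are made up for illustration:

```sql
-- Covers: SELECT OrderDate, Total FROM dbo.Orders WHERE CustomerId = @id
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
ON dbo.Orders (CustomerId)
INCLUDE (OrderDate, Total)   -- INCLUDE requires SQL Server 2005 or later
GO
```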
A part chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Hardware Performance Tuning

You have upgraded your hardware to a cutting-edge server machine. Congratulations, but do not forget that the person at the machine matters more than the machine itself. Now is the right time to tune your new machine to get the most out of it: poor tuning can lead to poor performance no matter how sophisticated or state-of-the-art your server is.
  1. Although heavy-duty hardware can help SQL Server’s performance, application and database design can play a greater part in overall performance than hardware. Keep this in mind, as throwing good money after bad on server hardware does not always fix SQL Server performance problems. Before getting faster hardware, be sure you have thoroughly tuned your applications, Transact-SQL, and database indexing.
  2. In many cases, adding RAM to a server is the cheapest and fastest way to boost hardware performance of a SQL Server. But before adding more RAM to a SQL Server, ensure first that it will be used by SQL Server. Adding more RAM doesn’t mean that SQL Server will always use it. If the current Buffer Hit Cache Ratio is consistently above 99% and you have well more than 10 MB of Available RAM, your server won’t benefit from adding additional RAM.
  3. If your SQL Server’s total CPU utilization is consistently above 80% or more, you need more CPUs, faster CPUs, or you need to find a way to reduce the load on the current server.
  4. If the Physical Disk Object: % Disk Time counter exceeds 55%, and the Physical Disk Object: Avg. Disk Queue Length exceeds a count of 2 for each individual disk drive in your disk storage subsystem, then you most likely experiencing a disk I/O performance issue and need to start looking for solutions.
  5. Don’t run any applications on your server other than SQL Server, with the exception of necessary utilities.
  6. NTFS-formatted partitions should not exceed 80% of their capacity. For example, if you have a 100GB logical drive, it should never be fuller than 80GB. Why? NTFS needs room to work, and when you exceed 80% capacity, NTFS become less efficient and I/O can suffer for it.
  7. If your SQL Server database is mostly reads, then a RAID 5 array offers good protection and adequate performance. If your SQL Server database is mostly writes, then use a RAID 10 array for best protection and performance.
  8. If your SQL Server’s tempdb database is heavily used by your application(s), consider locating it on an array of its own (such as RAID 1 or RAID 10). This will allow disk I/O to be more evenly distributed, reducing disk I/O contention issues, and speeding up SQL Server’s overall performance.
  9. The more spindles you have in an array, the faster disk I/O will be.
  10. Ensure that all hardware is running the latest, approved drivers.
Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Performance Monitoring

Hardware bottlenecks are not always responsible for performance degradation; there are various possible causes. To detect the right culprit, a DBA should pay special attention to performance monitoring. For this purpose:
  1. Regularly monitor your SQL Servers for blocked transactions.
  2. Regularly monitor system performance using System Monitor. Use System Monitor for both real-time analysis and for historical/baseline analysis.
  3. If running SQL Server 2005 SP2 or later, install the free SQL Server Performance Dashboard. It can be used for real-time monitoring and performance troubleshooting.
  4. Regularly monitor activity using Profiler.
  5. Be sure that traces are taken during the busiest times of the day so you get a more representative trace of what is going on in each server. When running the Profiler, do not collect more data than you need to collect.
  6. Perform performance monitoring from a computer that is not the SQL Server you are monitoring. Run monitoring tools on a separate desktop or server.
Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Backup

Backup is a skill that no one in the organization needs more than the DBA. It is the ultimate shield, and an optimized backup plan can be of great advantage. Following are some baselines for database backups.

  1. All production databases should be set to use the full recovery model. This way, you can create transaction log backups on a periodic basis.
  2. Whenever possible, perform a daily full backup of all system and user databases.
  3. For all production databases, perform regular transaction log backups, at least once an hour.
  4. Perform full backups during periods of low user activity in order to minimize the impact of backups on users.
  5. Periodically test backups to ensure that they are good and can be restored.
  6. Backup first to disk, then move to tape or some other form of backup media.
  7. Store backups offsite.
  8. If using SQL Server 2005 encryption, be sure to backup the service master key, database master keys, and certificates.
  9. If you find that backup times take longer than your backup window, or if backup file sizes are taking up too much space on your storage device, consider a third party backup program, such as SQL Backup Pro or SQL tool belt.
  10. Document, step-by-step, the process to restore system and user databases onto the same, or a different server. You don’t want to be looking this information up during an emergency.
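Guidelines 1-3 translate into commands along these lines (the database name and paths are placeholders):

```sql
-- Make sure the database uses the full recovery model
ALTER DATABASE ProdDB SET RECOVERY FULL
GO

-- Daily full backup
BACKUP DATABASE ProdDB
TO DISK = 'D:\Backups\ProdDB_Full.bak'
WITH INIT, NAME = 'ProdDB full backup'
GO

-- Transaction log backup, scheduled at least hourly
BACKUP LOG ProdDB
TO DISK = 'D:\Backups\ProdDB_Log.trn'
GO
```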

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

Saturday, August 9, 2008

DBA Best Practices for Disaster Recovery

Keeping in view the sensitivity of the topic, a DBA must pay full attention to being prepared to recover from any type of disaster. The following are only very general guidelines; the task requires planning and practice.

  1. You must create a disaster recovery plan and include every detail you will need to rebuild your servers.
  2. As your SQL Servers change over time, don’t forget to update your disaster recovery plan.
  3. Write the disaster recovery plan so that any computer literate person will be able to read and follow it. Do not assume a DBA will be rebuilding the servers.
  4. Fully test your disaster recovery plan at least once a year.

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for General High Availability

Due to organizations' growing dependency on data, high availability is imperative. While implementing any high-availability configuration, follow these baselines.

  1. Physically protect your SQL Servers from unauthorized users.
  2. Physically document all of your SQL Server instances. Incorporate effective change management.
  3. Always use a RAIDed array or SAN for storing your data.
  4. Use SQL Server clustering, database mirroring, or log shipping to provide extra fault tolerance.
  5. Replication is not an effective means to protect your data.
  6. Ensure that your entire IT infrastructure is redundant. It is only as strong as its weakest link.
  7. Always use server-class hardware, and standardize on the same hardware as much as possible.
  8. Use hardware and software monitoring tools so you can quickly become aware of when problems first arise.
  9. After testing, apply all new service packs and hot fixes to the OS and SQL Server.
  10. Cross-train staff so that there are multiple people who are able to deal with virtually any problem or issue.

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Replication

The following best practices will help you configure replication successfully.

  1. Replication needs should be clearly defined before creating a replication topology. Successful replication can be difficult and requires much pre-planning.
  2. Ideally publishers, distributors, and subscribers should be on separate physical hardware.
  3. Create, document, and test a backup and restore strategy. Restoring replicated databases can be complex and requires much planning and practice.
  4. Script the replication topology as part of your disaster recovery plan so you can easily recreate your replication topology if needed.
  5. Use default replication settings, unless you can ensure that a non-default setting will actually improve replication performance or other issues. Be sure that you test all changes to ensure that they are as effective as you expect.
  6. Fully understand the implications of adding or dropping articles, changing publication properties, and changing schema on published databases, before making any of these changes.
  7. Periodically, validate data between publishers and subscribers.
  8. Regularly monitor replication processes and jobs to ensure they are working.
  9. Regularly monitor replication performance, and performance tune as necessary.
  10. Add alerts to all replication jobs so you are notified of any job failures.

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Database Settings

If you are going to change database settings, wait and read the following baselines as best practices first.

  1. Unless you know exactly what you are doing and have already performed impartial experiments that prove that making SQL Server configuration changes helps you in your particular environment, do not change any of the SQL Server configuration settings.
  2. In almost all cases, leave the “auto create statistics” and “auto update statistics” options on for all user databases.
  3. In most cases, the settings for the “maximum server memory” and the “minimum server memory” should be left to their default values. This is because the default values allow SQL Server to dynamically allocate memory in the server for the best overall optimum performance. If you use AWE memory, then this recommendation is to be ignored, and maximum memory needs to be set manually.
  4. Many databases need to be shrunk periodically in order to free up disk space as older data is deleted from the database. But don’t be tempted to use the “auto shrink” database option, as it can waste SQL Server resources unnecessarily. Instead, shrink databases manually.
  5. Don’t rely on AUTOGROWTH to automatically manage the size of your databases. Instead, proactively monitor and alter database size as circumstances dictate. Only use AUTOGROWTH to deal with unexpected growth.
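For points 4 and 5, manual alternatives to "auto shrink" and unplanned autogrowth can be sketched like this (file and database names are placeholders):

```sql
-- Manual, targeted shrink instead of the "auto shrink" option
DBCC SHRINKFILE (ProdDB_Data, 10240)   -- shrink data file toward 10 GB (target in MB)
GO

-- Proactively size the file instead of relying on AUTOGROWTH
ALTER DATABASE ProdDB
MODIFY FILE (NAME = ProdDB_Data, SIZE = 20480MB)
GO
```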

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Job Maintenance

Jobs are a great automation tool in SQL Server and, like every great resource, should be used with care and planning. Follow these baselines to get more out of this facility.
  1. Avoid overlapping jobs on the same SQL Server instance. Ideally, each job should run separately at different times.
  2. When creating jobs, be sure to include error trapping, log job activity, and set up alerts so you know instantly when a job fails. Create a special SQL Server login account whose sole purpose is to run jobs, and assign it to all jobs.
  3. If your jobs include Transact-SQL code, ensure that it is optimized to run efficiently.
  4. Periodically (daily, weekly, or monthly) perform a database reorganization on all the indexes on all the tables in all your databases. This will rebuild the indexes so that the data is no longer logically fragmented. Fragmented data can cause SQL Server to perform unnecessary data reads, slowing down SQL Server’s performance. Reindexing tables will also update column statistics.
  5. Don’t reindex your tables when your database is in active production, as it can lock resources and cause your users performance problems. Reindexing should be scheduled during down times, or during light use of the databases.
  6. At least every two weeks, run DBCC CHECKDB on all your databases to verify database integrity.
  7. Avoid running most DBCC commands during busy times of the day. These commands are often I/O intensive and can reduce performance of the SQL Server, negatively affecting users.
  8. If you rarely restart the mssqlserver service, you may find that the current SQL Server log gets very large and takes a long time to load and view. You can truncate (essentially create a new log) the current server log by running DBCC ERRORLOG. Set this up as a weekly job.
  9. Script all jobs and store these scripts in a secure area so they can be used if you need to rebuild the servers.
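A maintenance job built from the guidelines above might contain steps like these (database and table names are placeholders):

```sql
-- Guideline 4: defragment indexes (REORGANIZE is lighter-weight than REBUILD)
ALTER INDEX ALL ON dbo.BigTable REORGANIZE
GO

-- Guideline 6: verify database integrity at least every two weeks
DBCC CHECKDB ('ProdDB')
GO

-- Guideline 8: start a new SQL Server error log
DBCC ERRORLOG
GO
```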

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for SQL Server Security

The following best practices may be implemented as a baseline for standard SQL Server security.


  1. Ensure the physical security of each SQL Server, preventing any unauthorized users from physically accessing your servers.
  2. Only install required network libraries and network protocols on your SQL Server instances.
  3. Minimize the number of sysadmins allowed to access SQL Server.
  4. As a DBA, log on with sysadmin privileges only when needed. Create separate accounts for DBAs to access SQL Server when sysadmin privileges are not needed.
  5. Assign the SA account a very obscure password, and never use it to log onto SQL Server. Use a Windows Authentication account to access SQL Server as a sysadmin instead.
  6. Give users the least amount of permissions they need to perform their job.
  7. Use stored procedures or views to allow users to access data instead of letting them directly access tables.
  8. When possible, use Windows Authentication logins instead of SQL Server logins.
  9. Use strong passwords for all SQL Server login accounts.
  10. Don’t grant permissions to the public database role.
  11. Remove user login IDs who no longer need access to SQL Server.
  12. Remove the guest user account from each user database.
  13. Disable cross database ownership chaining if not required.
  14. Never grant permission to the xp_cmdshell to non-sysadmins.
  15. Remove sample databases from all production SQL Server instances.
  16. Use Windows Global Groups, or SQL Server Roles to manage groups of users that need similar permissions.
  17. Avoid creating network shares on any SQL Server.
  18. Turn on login auditing so you can see who has succeeded, and failed, to login.
  19. Don’t use the SA account, or login IDs who are members of the Sysadmin group, as accounts used to access SQL Server from applications.
  20. Ensure that your SQL Servers are behind a firewall and are not exposed directly to the Internet.
  21. Remove the BUILTIN/Administrators group to prevent local server administrators from being able to access SQL Server. Before you do this on a clustered SQL Server, check Books Online for more information.
  22. Run each separate SQL Server service under a different Windows domain account.
  23. Only give SQL Server service accounts the minimum rights and permissions needed to run the service. In most cases, local administrator rights are not required, and domain administrator rights are never needed. SQL Server setup automatically configures service accounts with the necessary permissions for them to run correctly, so you don’t have to do anything.
  24. When using distributed queries, use linked servers instead of remote servers.
  25. Do not browse the web from a SQL Server.
  26. Instead of installing virus protection on a SQL Server, perform virus scans from a remote server during a part of the day when user activity is less.
  27. Add operating system and SQL Server service packs and hot fixes soon after they are released and tested, as they often include security enhancements.
  28. Encrypt all SQL Server backups with a third-party backup tool, such as SQL Backup Pro.
  29. Only enable C2 auditing or Common Criteria compliance if required.
  30. Consider running a SQL Server security scanner against your SQL servers to identify security holes.
  31. Consider adding a certificate to your SQL Server instances and enable SSL or IPSEC for connections to clients.
  32. If using SQL Server 2005, enable password policy checking.
  33. If using SQL Server 2005, implement database encryption to protect confidential data.
  34. If using SQL Server 2005, don’t use the SQL Server Surface Area Configuration tool to unlock features you don’t absolutely need.
  35. If using SQL Server 2005 and you create endpoints, only grant CONNECT permissions to the logins that need access to them. Explicitly deny CONNECT permissions to endpoints that are not needed by users.
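To illustrate items 32 and 35 in T-SQL (the login and endpoint names are hypothetical):

```sql
-- Item 32: enforce the Windows password policy on a SQL login (SQL Server 2005)
ALTER LOGIN AppLogin WITH CHECK_POLICY = ON, CHECK_EXPIRATION = ON
GO

-- Item 35: grant endpoint access only where needed, deny it elsewhere
GRANT CONNECT ON ENDPOINT::MirroringEndpoint TO [DOMAIN\AppService]
DENY  CONNECT ON ENDPOINT::MirroringEndpoint TO SomeOtherLogin
GO
```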

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Upgrading

  1. Run the Upgrade Advisor before upgrading. Make any necessary changes before performing the upgrade.
  2. Perform a test upgrade of your test SQL Servers before you upgrade your production servers. And don’t forget to test your applications with the new version also.
  3. Before you upgrade, be sure you have a plan in place to fall back to in case the upgrade is problematic.
  4. Don’t upgrade SQL Server clusters in place. Instead, rebuild them on new hardware.
  5. If you upgrade from a previous version of SQL Server, you should update all of the statistics in all your databases using either UPDATE STATISTICS or sp_updatestats. This is because statistics are not automatically updated during the upgrade process.
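Point 5 in practice, run in each upgraded database (ProdDB is a placeholder):

```sql
USE ProdDB
GO

-- Refresh statistics after the upgrade; alternatively run UPDATE STATISTICS per table
EXEC sp_updatestats
GO
```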

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices for Installation

For the installation process, the following checklist will be helpful.
  1. Always fully document installs so that your SQL Server instances can easily be reproduced in an emergency.
  2. If possible, install and configure all of your SQL Server instances consistently, following an agreed-upon organization standard.
  3. Don’t install SQL Server services you don’t use, such as Microsoft Full-Text Search, Notification Services, or Analysis Services.
  4. For best performance of SQL Server running under Windows, turn off any operating system services that aren’t needed.
  5. For optimum SQL Server performance, you want to dedicate your physical servers to only running a single instance of SQL Server, no other applications.
  6. For best I/O performance, locate the database files (.mdf) and log files (.ldf) on separate arrays on your server to isolate potentially conflicting reads and writes.
  7. If tempdb will be used heavily, also put it on its own separate array.
  8. Do not install SQL Server on a domain controller.
  9. Be sure that SQL Server is installed on an NTFS partition.
  10. Don’t use NTFS data file encryption (EFS) and compression on SQL Server database and log files.
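For item 1, a quick way to capture the basic facts about an instance for your install documentation is the SERVERPROPERTY function. A sketch:

```sql
-- Capture key instance facts for installation documentation.
SELECT SERVERPROPERTY('MachineName')    AS MachineName,
       SERVERPROPERTY('InstanceName')   AS InstanceName,
       SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ServicePackLevel,
       SERVERPROPERTY('Collation')      AS ServerCollation,
       SERVERPROPERTY('IsClustered')    AS IsClustered
```

Saving this output alongside your configuration notes makes an instance easier to reproduce in an emergency.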

Chosen from SQL Server DBA Best Practices by Brad M. McGehee

DBA Best Practices Day to Day

A DBA may keep a defined set of practices to be carried out daily to detect and overcome any anomaly. The following is a sketch of general daily practices for a DBA.
  1. Check OS Event Logs, SQL Server Logs, and Security Logs for unusual events.
  2. Verify that all scheduled jobs have run successfully.
  3. Confirm that backups have been made and successfully saved to a secure location.
  4. Monitor disk space to ensure your SQL Servers won’t run out of disk space.
  5. Throughout the day, periodically monitor performance using both System Monitor and Profiler.
  6. Use Enterprise Manager/Management Studio to monitor and identify blocking issues.
  7. Keep a log of any changes you make to servers, including documentation of any performance issues you identify and correct.
  8. Create SQL Server alerts to notify you of potential problems, and have them emailed to you. Take actions as needed.
  9. Run the SQL Server Best Practices Analyzer on each of your server’s instances on a periodic basis.
  10. Take some time to learn something new as a DBA to further your professional development.
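For item 2, failed jobs can be spotted in the SQL Server Agent history tables in msdb. A sketch:

```sql
-- Job outcome rows (step_id 0) whose status is failed (run_status 0).
SELECT j.name AS JobName, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobs j
INNER JOIN msdb.dbo.sysjobhistory h ON j.job_id = h.job_id
WHERE h.step_id = 0
  AND h.run_status = 0
ORDER BY h.run_date DESC, h.run_time DESC
```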
Chosen from SQL Server DBA Best Practices by Brad M. McGehee

Thursday, August 7, 2008

Change compatibility level of a SQL Server database

I have a database with compatibility level 80 (upgraded from SQL Server 2000 to SQL Server 2005). Today, when I tried to disable a trigger with the following command:

USE DBNameHere
GO

DISABLE TRIGGER triggerName
ON tablename
GO

the following error occurred:


Msg 156, Level 15, State 1, Line 1

Incorrect syntax near the keyword 'TRIGGER'.


It was surprising, as the command looked valid and I had used it many times before. For the time being I had no option but to disable the trigger through SSMS. Later I tried a different approach, and the following command worked:


USE DBNameHere
GO

ALTER TABLE tableName
DISABLE TRIGGER TriggerName
GO


So if you ever get the above error on an enable/disable trigger command, it may be a compatibility level issue: DISABLE TRIGGER is a SQL Server 2005 statement, and it will not work under compatibility level 80 (SQL Server 2000). If nothing else prevents it, you can simply update the compatibility level of upgraded databases to 90.
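To check the current compatibility level of a database first, a quick sketch (the database name is a placeholder):

```sql
-- SQL Server 2005: compatibility_level column in sys.databases
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'DBNameHere'
GO

-- Works on SQL Server 2000 as well: called with only the database name,
-- sp_dbcmptlevel reports the current level.
EXEC sp_dbcmptlevel 'DBNameHere'
GO
```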


To change the compatibility level of a database to 90 (SQL Server 2005), use the following script:


ALTER DATABASE AdventureWorks
SET SINGLE_USER
GO

EXEC sp_dbcmptlevel AdventureWorks, 90;
GO

ALTER DATABASE AdventureWorks
SET MULTI_USER
GO


Through SSMS you may change the compatibility level by right-clicking the database, selecting Properties, and changing the level on the Options page.




For reference, the compatibility levels of SQL Server databases are as follows:

* 60 = SQL Server 6.0

* 65 = SQL Server 6.5

* 70 = SQL Server 7.0

* 80 = SQL Server 2000

* 90 = SQL Server 2005

Wednesday, August 6, 2008

Useful T-SQL functions for use in scripts

Here are some useful T-SQL functions and commands which may be helpful during several operations.
Change compatibility level of a DB
EXEC sp_dbcmptlevel AdventureWorks, 90

To get the name of the SQL Server instance
SELECT @@SERVERNAME

To get the name of the SQL Server service
SELECT @@SERVICENAME

To get info about the logged-in user
SELECT @@SPID AS 'ID', SYSTEM_USER AS 'Login Name', USER AS 'User Name'

To check disk allocation for a database
DBCC CHECKALLOC

To check the number of connections/attempted connections since SQL Server started
SELECT @@CONNECTIONS

To display info about a database
sp_helpdb pubs

To display info about a table
sp_help tablename

To get a list of databases
sp_databases

To get the name of your current database
SELECT DB_NAME()

To retrieve the character map
EXEC sp_helpsort

These are all building blocks that will help you gather valuable information while creating triggers, logs, stored procedures, etc.
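As a small illustration, several of these building blocks can be combined in a single statement, for example to stamp a log row from inside a trigger. The AuditLog table and its columns here are hypothetical:

```sql
-- Hypothetical audit insert, e.g. from inside a trigger:
-- records the server, database, login, session, and time of the action.
INSERT INTO dbo.AuditLog (ServerName, DatabaseName, LoginName, SPID, EventTime)
SELECT @@SERVERNAME, DB_NAME(), SYSTEM_USER, @@SPID, GETDATE()
```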

Set SQL Server database to single user mode or multi user mode

Often we need to set a database to single user mode, and after the specified operation switch it back to multi user mode. Here is a script to perform the switch operation.
To change the AdventureWorks database to single user mode:
ALTER DATABASE adventureworks
SET SINGLE_USER
GO

and to switch back to multi user mode:
ALTER DATABASE adventureworks
SET MULTI_USER
GO

Also consider that if users are connected to the database, their connections will be dropped without any warning. And you cannot set the system databases master, tempdb, and msdb to single user mode this way; for that you have to start SQL Server itself in single user mode.
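If existing connections would otherwise block the switch, ALTER DATABASE accepts a termination option. A sketch:

```sql
-- Roll back any open transactions and drop other connections immediately,
-- then take the database to single user mode.
ALTER DATABASE AdventureWorks
SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
```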
You can also set a database to single user mode through the GUI. In SSMS, follow these steps:
  • Right-click the database whose access mode you want to set
  • Go to Options in the left panel
  • In the right panel you will find the 'Restrict Access' option
  • Choose single user mode
  • Click OK
Follow the same procedure to set the database back to multi user mode.

To get data and log files info in SQL Server

Here is a script to get information about the data and log files of all your databases.

SELECT DB_NAME(dbid) AS DatabaseName,
fileid, [filename] AS FullPath,
CONVERT(VARCHAR(20), size) + ' ' + '8KB Pages' AS FileSize,
name AS LogicalFileName
FROM sys.sysaltfiles
ORDER BY DatabaseName, fileid
GO
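The size column of sysaltfiles is stored in 8 KB pages, so to report megabytes instead, multiply by 8 and divide by 1024. A variant:

```sql
-- File sizes converted from 8 KB pages to megabytes.
SELECT DB_NAME(dbid) AS DatabaseName,
       name AS LogicalFileName,
       [filename] AS FullPath,
       CONVERT(VARCHAR(20), size * 8 / 1024) + ' MB' AS FileSizeMB
FROM sys.sysaltfiles
ORDER BY DatabaseName, fileid
GO
```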


Something good about cursor performance

Cursors are mostly notorious for performance degradation and resource consumption, and are usually a last resort. But there are situations where cursors may help you boost performance. If your application or procedure requires repeated use of a set of records, it is faster to create a cursor once and reuse it several times than to repeatedly query the database. But do not forget to properly close the cursor at the end and deallocate its resources.
Follow these steps to create, use, and close a database cursor:

  • Create the cursor
  • Open the cursor for use within the procedure or application
  • Fetch a record's data one row at a time until you have reached the end of the cursor's records
  • Close the cursor when you are finished with it
  • Deallocate the cursor to completely discard it

Consider the following example of a cursor created on pubs..authors:
DECLARE @fName VARCHAR(40)
DECLARE @lName VARCHAR(40)
DECLARE @city VARCHAR(40)

DECLARE OAKLAND CURSOR FOR
SELECT au_fname, au_lname, city
FROM pubs..authors

OPEN OAKLAND

-- Fetch the first row; column order must match the variable order
FETCH NEXT FROM OAKLAND INTO @fName, @lName, @city

WHILE (@@FETCH_STATUS = 0)
BEGIN
    -- Process the current row before fetching the next one
    IF (@city = 'OAKLAND')
    BEGIN
        PRINT @fName + ' ' + @lName
    END
    FETCH NEXT FROM OAKLAND INTO @fName, @lName, @city
END

CLOSE OAKLAND
DEALLOCATE OAKLAND
GO

You may find more detailed coverage of cursors in SQL Server Books Online.

Search for a column in all tables of a database

Today I read an interesting question on a blog: how to search for a column in all tables of a database. The following script was given by the author to search for columns with names like EmployeeID in the AdventureWorks database.

USE AdventureWorks
GO

SELECT t.name AS table_name,
SCHEMA_NAME(SCHEMA_ID) AS SCHEMA_NAME,
c.name AS column_name
FROM sys.tables AS t
INNER JOIN sys.columns c ON t.OBJECT_ID = c.OBJECT_ID
WHERE c.name LIKE '%EmployeeID%'
ORDER BY SCHEMA_NAME, table_name



It served the purpose, but I have a simpler approach. My script generates the same result in a simpler and more flexible way:

USE AdventureWorks
GO


SELECT Table_Schema, Table_Name, Column_Name, Data_Type
FROM INFORMATION_SCHEMA.COLUMNS
WHERE Table_Name IN (SELECT name FROM AdventureWorks..sysobjects
                     WHERE xtype = 'U')
  AND Column_Name LIKE '%EmployeeID%'
ORDER BY Table_Schema, Table_Name
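If you prefer to stay entirely within the standard INFORMATION_SCHEMA views (the sysobjects subquery only serves to filter out views), the same filter for user tables can be expressed with INFORMATION_SCHEMA.TABLES. A sketch:

```sql
-- Restrict to base tables via INFORMATION_SCHEMA.TABLES instead of sysobjects.
SELECT c.Table_Schema, c.Table_Name, c.Column_Name, c.Data_Type
FROM INFORMATION_SCHEMA.COLUMNS c
INNER JOIN INFORMATION_SCHEMA.TABLES t
        ON c.Table_Schema = t.Table_Schema
       AND c.Table_Name = t.Table_Name
WHERE t.Table_Type = 'BASE TABLE'
  AND c.Column_Name LIKE '%EmployeeID%'
ORDER BY c.Table_Schema, c.Table_Name
```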


Use whichever approach is more convenient.

Saturday, August 2, 2008

Severity levels of error messages in SQL Server 2000

The severity level of an error message indicates the type of problem that Microsoft® SQL Server™ 2000 has encountered. You can see all system messages by running the following simple query:

SELECT *
FROM master.dbo.sysmessages

Level 0 and 10

Severity level 0 and 10 messages are informational and indicate problems caused by mistakes in the information you have entered.


Level 11 - 16
Severity levels from 11 through 16 are generated by the user, and can be corrected by the user.


Level 17 - 25
Severity levels from 17 through 25 indicate software or hardware errors. You should inform the system administrator whenever problems that generate errors with severity level 17 and higher occur. The system administrator must resolve these errors and track their frequency.

When a level 17, 18, or 19 error occurs, you can continue working, although you might not be able to execute a particular statement. The system administrator should monitor all problems that generate severity levels from 17 through 25 and print the error log, which contains information to backtrack from the error.

If the problem affects an entire database, you can use DBCC CHECKDB (database) to determine the extent of the damage. DBCC may identify some objects that must be removed and will optionally repair the damage. If the damage is extensive, the database might have to be restored.
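The DBCC CHECKDB call mentioned above takes the database name. A minimal sketch, using the pubs sample database:

```sql
-- Check the logical and physical integrity of all objects in the database;
-- NO_INFOMSGS suppresses informational messages so only errors are reported.
DBCC CHECKDB ('pubs') WITH NO_INFOMSGS
```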


For user defined error messages
When specifying user-defined error messages with RAISERROR, use error message numbers greater than 50,000 and severity levels from 0 through 18. Only system administrators can issue RAISERROR with a severity level from 19 through 25.
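A short sketch of both styles of user-defined error; the message number 60001 and the texts are arbitrary examples:

```sql
-- Register a user-defined message (numbers must be greater than 50000),
-- then raise it by number.
EXEC sp_addmessage @msgnum = 60001, @severity = 16,
     @msgtext = 'Custom error: invalid order state.'
GO

RAISERROR (60001, 16, 1)
GO

-- Or raise an ad hoc message without registering it first:
RAISERROR ('Ad hoc error in procedure %s', 16, 1, 'MyProc')
GO
```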


For more detailed information, see the following links:

http://msdn.microsoft.com/en-us/library/aa937483(SQL.80).aspx

http://blog.sqlauthority.com/2007/04/25/sql-server-error-messages-sysmessages-error-severity-level/