Jun 14 2016
 

The host for T-SQL Tuesday #79 is Michael J. Swart (b|t), and his topic of choice is to write about the new release of SQL Server 2016.

This past weekend I attended the SQL Server 2016 Launch Discovery Day in Raleigh, NC. I have attended several SQL Server launch events over the years, but this one was quite different. While it wasn’t a real launch event, it was marketed as a hands-on event to learn about the newest release of SQL Server. The hands-on part was for everyone to break up into teams of five to solve a pre-determined problem. Basically, this was a hackathon for SQL Server, and something I’d never done before.

We started early in the morning with a few presentations about SQL Server 2016. Kevin Feasel (b|t) spoke about the new features, and SQL Server MVP Rick Heiges (b|t) spoke about a real-world win using columnstore indexes in SQL Server 2016. Just before our lunch break the hack was revealed; the basics of which were simple.

First, you are given a data set: just five tables containing the following information.

  • Virtual Chapter membership (with member location)
  • SQL Saturday Registration Data and session/track
  • PASS Membership Data
  • Multi-year Summit Data including title, track, score, etc.
  • Multi-year Summit Registration data including geographic location of attendee

Then you must design a solution to answer these questions.

  • From how far away do attendees travel to SQL Saturday? Are there any geographic trends to the distances traveled?
  • Does the SQL Saturday session data provide any insight into what sessions are chosen for Summit? Are there any trends in session or topic content that can be established?
  • Are there any geographical insights that can be discerned? Do the sessions presented at SQL Saturday help predict popularity or selection of sessions at Summit?
  • Does virtual chapter member data provide any insights into PASS membership growth?

The judges score each solution based on the following criteria.

  • Use of new features in SQL Server 2016
  • Usefulness of the dashboard in Power BI/visualizations for the community
  • Completeness of the solution
  • Innovativeness of solution
  • Bonus points for mobile friendliness

Your solution must be completed in 3 hours.

On paper this all sounds pretty easy, but in practice it was quite hard. I am no BI developer and the other members of my team did not have any expertise in that area either, but we still managed to create a solution and have fun doing so.

The first issue we had was how to combine our development work on the same database. This one was easy…just use Azure. In the span of about 30 minutes, I spun up a new Azure VM with SQL Server 2016 pre-installed, uploaded the database, set up logins, and opened the appropriate ports. I then gave my team members the URL and credentials so they each could connect from their laptops.

One of my team members, Dhruv, wanted to get SQL Server R Services installed to analyze the data set. Machine learning was his specialty, and since R Services is a new feature in SQL Server 2016, we thought this would be a good place to start. However, this proved to be a mistake for our team. We spent way too much time trying to get it set up, mainly due to the need to install the R components without an internet connection, or I should say over a slow one. I wish we could have implemented this, because Dhruv had some really good ideas. Even without R Services, he was able to create some nifty reports using Power BI.

One of my other team members, Mike, spent a lot of time trying to understand the data and how to query it to produce the data for our reports. I’m not sure if this was by design, but let me just say the data set was horrible, and every team complained about the time needed to clean it up to the point of being useful. Either way, it was just one of the many problems that we needed to overcome. Most of Mike’s code was used in the Power BI dashboard that Dhruv created, but he was also able to write some code that surfaced good insights about membership growth potential; however, we did not have time to build a report for it.

Our team (Team Tiger) finished our solution and presented it to the group, but it was clear from the other presentations that we had been overmatched. The winning solution came from Team Cheetah, who had some unique insights into the data and designed their reports to reflect that detail. Not to mention, their presentation went into a lot of detail about what they had learned.

I really liked the entire event, but I wish we had more time to work on the solution. Three hours seems like a lot, but after thinking about the challenge for a few days, I came up with so many more ideas. For starters, one of the areas to score points was the use of new SQL Server 2016 features. That basically translates into ANY new feature, whether it helps you produce a better looking report or not. With that in mind, I could have done the following.

  • Enabled the Query Store to capture performance metrics from our solution.
  • Enabled Row-Level Security so the reports could show different data based on which user is running them.
  • Spun up additional VMs in different Azure datacenters to create a load-balanced Availability Group that would provide high availability and better response times for users closer to each datacenter.
  • Set up Stretch Database for a single table to store older data in Azure.
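The first two of those are quick wins in T-SQL. Here is a minimal sketch of what they might have looked like; the database, table, and function names below are hypothetical, not from our actual solution:

```sql
-- Turn on the Query Store to capture performance metrics (database name is illustrative)
ALTER DATABASE [HackathonDB] SET QUERY_STORE = ON;
GO

-- Row-Level Security: a hypothetical predicate that returns only the rows
-- owned by the current database user (dbo sees everything)
CREATE FUNCTION dbo.fn_ChapterPredicate (@ChapterOwner AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @ChapterOwner = USER_NAME() OR USER_NAME() = 'dbo';
GO

CREATE SECURITY POLICY dbo.ChapterFilter
ADD FILTER PREDICATE dbo.fn_ChapterPredicate(ChapterOwner)
ON dbo.VirtualChapterMembers
WITH (STATE = ON);
```

With the security policy in place, the same report query silently returns different rows depending on who runs it, which is exactly what the Power BI dashboard would have needed.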

While none of these things would have improved our Power BI presentation, they are tasks I could have easily implemented given my skillset, and they would have definitely scored us more points for the use of new SQL Server 2016 features. This is the big lesson I learned from the event…always play to your strengths. Don’t try to learn a new skill in a few hours; just use the ones you already have. It will be a much better use of your time, and will most likely produce a better end result.

As I said, this was my first hackathon of any kind, but now I can’t wait to attend another one; especially one that deals with SQL Server.

Oct 08 2013
 

The host for T-SQL Tuesday #47 is Kendal Van Dyke (blog|twitter), and his topic of choice is about the best SQL Server SWAG we ever received at a conference; specifically, the “good stuff”.

I’ve been doing a lot of work with SQL Server over the years, but I’ve only had the opportunity to attend the big conferences a few times. As a matter of fact, next week will be my first time attending the SQL PASS Summit. We’re supposed to talk about the “good stuff” and not any of the “cheap tchotchkes” that are given away by the vendors, but I feel that I really have to include both.

 

First, I’d like to talk about a piece of swag that I received while at the SQL Server Connection conference in Las Vegas in November 2007. This wasn’t my first trip to Las Vegas, but it was my first conference there. And to make it better, one of my best friends from college was attending the same conference. So you can only imagine the fun I had while in Las Vegas with “college buddy”. In the vendor area, one of the representatives from Microsoft was handing out koozies with SQL Server 2008 printed on the side. These were not your normal koozies. They were slap koozies!

I actually own two other slap koozies, but this one was definitely going to be my new favorite. Like I said, it’s cheap, but I love it, and it’s a great conversation starter.

Now let’s talk about the good stuff.

The date was November 7, 2005.  The location was the Moscone Center in San Francisco, CA. The event was the launch party for SQL Server 2005, Visual Studio 2005, .NET Framework 2.0, and BizTalk Server 2006. Microsoft had just spent years rewriting SQL Server, and now they were throwing this elaborate party to celebrate the release. Unlike a week-long conference, this one-day event was completely free. I was living in San Francisco at the time, so it made it really easy to get to this event. All I had to do was hop on my scooter and head downtown. Microsoft didn’t disappoint for their launch party. The event boasted some big headliners. Microsoft CEO Steve Ballmer gave the keynote speech.

The live entertainment was also exciting.  The headliner band was Cheap Trick.  Although not at the height of their popularity, they are a talented rock band in any era.

There was also an all-girl cover band that played nothing but AC/DC music.  They were called AC/DShe. Quite a catchy name.

The other highlight of the night was the presence of Paul Teutul, Sr. from the Orange County Choppers show on the Discovery Channel. Not only was Paul Sr. there hanging out in the crowd taking pictures with the attendees, but his team built a chopper for Microsoft with the SQL Server logo on it.

 

So finally to the swag. Each attendee was given a free copy of SQL Server 2005 Standard Edition and Visual Studio 2005 Professional Edition. Most software given away at conferences is an evaluation or time-bombed copy, but these were fully licensed copies. In 2005, this was probably $1,000 worth of software that was now mine.

It may sound anticlimactic, but for a guy on a shoe-string budget, living in one of the most expensive cities in the country, this was definitely the best swag I’ve ever received.

Mar 12 2013
 

T-SQL Tuesday #40 is underway, and this month’s host is Jennifer McCown (blog|twitter).  The topic is about File and Filegroup Wisdom.  Jennifer says she’s a big fan of the basics, so I thought I would talk about the basics of proportional fill within a filegroup.  This should be pretty common knowledge, but I still talk to a lot of DBAs that don’t know anything about it, or if they have heard of it, they still don’t know how it works.
The proportional fill algorithm is used to keep the amount of free space within a filegroup evenly distributed across all files in a filegroup.  SQL Server’s proportional fill falls in line with the strategy of placing your files and filegroups across multiple disks, and thus, allowing for improved I/O performance.
Let’s say we need to add more storage space for our AdventureWorks2012 database that has outgrown the current drive D.  Because of storage limitations, we can’t add any more space to D, so our only choice is to add a completely new drive E.  
Once we add the new E drive to the server, we add a new data file to the PRIMARY filegroup of the AdventureWorks2012 database using the following query.
USE master;
GO
ALTER DATABASE AdventureWorks2012
ADD FILE (
     NAME = N'AdventureWorks2012_Data2'
    ,FILENAME = N'E:\MSSQL11.TEST1\MSSQL\DATA\AdventureWorks2012_Data2.ndf'
    ,SIZE = 200MB
    ,FILEGROWTH = 1024KB
) TO FILEGROUP [PRIMARY];
GO

One might think we’re safe at this point; however, because of the proportional fill feature we’re not.  Once new data is written to the data files, SQL Server will create the new page allocations on the newly created AdventureWorks2012_Data2.ndf file because it has a higher percentage of free space compared to AdventureWorks2012_Data.mdf.  Drive E now suddenly becomes a new I/O hotspot on the server.
You can check the space used with the following query.
USE AdventureWorks2012;
GO
SELECT
     name AS 'LogicalName'
    ,physical_name AS 'PhysicalName'
    ,CONVERT(INT,ROUND(size/128,0)) AS 'Size (MB)'
    ,CONVERT(INT,ROUND(FILEPROPERTY(name,'SpaceUsed')/128,0)) AS 'SpaceUsed (MB)'
FROM sys.database_files
WHERE type = 0;
GO

To avoid this disk hotspot issue, we need to have the data more evenly balanced across both files in the filegroup in terms of data page allocations.  The quickest way to do this is to rebuild all of the clustered indexes within the database.
ALTER INDEX [PK_AWBuildVersion_SystemInformationID] ON [dbo].[AWBuildVersion] REBUILD;
ALTER INDEX [PK_ErrorLog_ErrorLogID] ON [dbo].[ErrorLog] REBUILD;
ALTER INDEX [PK_Department_DepartmentID] ON [HumanResources].[Department] REBUILD;
:
:
ALTER INDEX [PK_Store_BusinessEntityID] ON [Sales].[Store] REBUILD;
GO

SQL Server will do its best to automatically rebalance all of the page allocations across all files within the same filegroup.  In our case, both data files are still part of the PRIMARY filegroup. 
Check the space used again with the following query.
USE AdventureWorks2012;
GO
SELECT
     name AS 'LogicalName'
    ,physical_name AS 'PhysicalName'
    ,CONVERT(INT,ROUND(size/128,0)) AS 'Size (MB)'
    ,CONVERT(INT,ROUND(FILEPROPERTY(name,'SpaceUsed')/128,0)) AS 'SpaceUsed (MB)'
FROM sys.database_files
WHERE type = 0;
GO

Now what we have is a much more evenly balanced allocation across both data files.  This will allow SQL Server to evenly distribute the write I/O across both disk drives.
By doing this one index maintenance step after adding a new file, you’ll help prevent a write hotspot on one of your disks and help SQL Server improve its I/O performance.  But keep in mind that proportional fill only applies to files within the same filegroup.  If we had added the second file to a new filegroup, then we would have had to manually move tables to the new filegroup.
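Moving a table to a new filegroup is itself just an index rebuild targeted at that filegroup. A minimal sketch, assuming a filegroup named [SECONDARY] already exists with at least one data file (the filegroup name here is illustrative):

```sql
-- Rebuilding the clustered index onto the new filegroup moves the table's
-- data pages along with it; DROP_EXISTING avoids a separate drop/create.
CREATE UNIQUE CLUSTERED INDEX [PK_Store_BusinessEntityID]
ON [Sales].[Store] ([BusinessEntityID])
WITH (DROP_EXISTING = ON)
ON [SECONDARY];
```

Nonclustered indexes stay on their original filegroup unless you rebuild them the same way.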
For more info on files and filegroups, check out Books Online.
Feb 12 2013
 

T-SQL Tuesday – This month’s party is hosted by Wayne Sheffield (blog|twitter), and the topic is about Powershell and how to use it for anything SQL Server.

With that challenge, I’d like to share a script I’ve written that takes a backup file from one server, copies it to another server, and then restores it.  That may sound pretty easy, but I’ve added a few requirements to the restore.




Here’s the scenario:
We have two SQL Servers, one production (TRON2\R2PROD) and one test (TRON3\R2TEST), and we have one user database (AdventureWorks2008R2) on each of the production and test servers.  The test server is used by a developer.  The developer sends us a request to “refresh the development database with a copy of production“.  This translates into: he needs the most recent backup of that production database copied from the production server over to the test server, then restored by overwriting the existing database, all while preserving his existing dbo-level permissions.
The manual approach to completing this task:
  1. Figure out which full database backup file is the most recent for AdventureWorks2008R2.
  2. Copy the file from TRON2 to TRON3.
  3. On TRON3R2TEST, script out all existing user permissions for the AdventureWorks2008R2 database.
  4. Restore the backup.
  5. Run the script from step 3 to reapply the developer’s permissions.
  6. Delete the backup file from TRON3.

Total time to execute this task manually: ~ 10 minutes.

That may not seem like much time out of your entire workday, but what if that same developer wants you to complete this task each morning at 8 AM?  Now you’re up to 10 minutes per day.  And what if he asked you to do it several times a day, every day of the week?  That 10 minutes can really add up.
The Powershell approach to completing this task:
  1. Run the AutoDatabaseRefresh.ps1 script.
Total time to execute this task using Powershell: < 30 seconds.

How’s that for performance improvement?

The great thing about Powershell is that it allows you to connect to different systems, such as Windows and SQL Server, all from a single programming language.  The entire script is written using SQL Server Management Objects (SMO).  It does not use any of the SQL Server cmdlets, so there are no modules to import.  Let’s take a closer look.
You need to pass 6 parameters to this script.
  1. $sourceInstance – Source SQL Server name
    • Example: “TRON2\R2PROD”
  2. $sourceDbName – Source database
    • Example: “AdventureWorks2008R2”
  3. $sourcePath – Source share where the file exists (UNC Path)
    • Example: “\\TRON2\BACKUP\R2PROD\AdventureWorks2008R2”
  4. $destinationInstance – Destination SQL Server name
    • Example: “TRON3\R2TEST”
  5. $destinationDbName – Database to be refreshed on destination server
    • Example: “AdventureWorks2008R2”
  6. $destinationPath – Destination share to copy backup file to (UNC Path)
    • Example: “\\TRON3\BACKUP”
The script needs to know both the source and destination SQL Servers (#1 and #4), and the source and destination database names (#2 and #5).  The other two parameters are the source and destination paths (#3 and #6), and they must be UNC file shares.  This is so the Powershell script can be executed from any server or from any DBA’s workstation.
The basic workflow of the Powershell script is as follows:
Step 1:  Validate the input parameters.  All connectivity to the SQL Servers and to the file shares uses Windows Authentication.  The script tests for blank parameters, tests the connectivity to each SQL Server, and tests that each file share exists.  If any of these validation tests fail, the script will halt.
if([String]::IsNullOrEmpty($sourceInstance))
{
    Write-Host "ERROR"
    $errorMessage = "Source server name is not valid."
    throw $errorMessage
}

Step 2:  Connect to $sourceInstance to get the name of the most recent backup file for $sourceDbName.  This is accomplished by running this TSQL script.
$server = GetServer($serverInstance)
$db = $server.Databases["msdb"]
$fileList = $db.ExecuteWithResults(
    @"
DECLARE
     @BackupId int
    ,@DatabaseName nvarchar(255);

SET @DatabaseName = '$sourceDbName';

-- Get the most recent full backup for this database
SELECT TOP 1
     @DatabaseName AS DatabaseName
    ,m.physical_device_name
    ,RIGHT(m.physical_device_name, CHARINDEX('\',REVERSE(physical_device_name),1) - 1) AS 'FileName'
    ,b.backup_finish_date
    ,b.type AS 'BackupType'
FROM msdb.dbo.backupset b JOIN msdb.dbo.backupmediafamily m
ON b.media_set_id = m.media_set_id
WHERE b.database_name = @DatabaseName
    AND b.type = 'D'
    AND b.is_snapshot = 0
    AND b.is_copy_only = 0
    AND b.backup_finish_date IS NOT NULL
ORDER BY b.database_backup_lsn DESC;
"@
)
This gives us the following output.

Step 3:  Copy the file from $sourcePath to $destinationPath.  From the output above, the physical file, AdventureWorks2008R2_db_201302060836.BAK, is located in D:\Backup\R2PROD\AdventureWorks2008R2, so the $sourcePath must match this location.  Our UNC path is \\TRON2\BACKUP\R2PROD\AdventureWorks2008R2.  This step uses the Copy-Item cmdlet.  In my testing I have seen this cmdlet outperform the regular Windows copy and even Robocopy.
$source = $sourcePath + "\" + $backupFile
Write-Host "Copying file..."
Copy-Item $source -Destination $destinationPath
Step 4:  Connect to $destinationInstance and script out all user-level permissions and database roles for $destinationDbName.  This is accomplished by using the following script.
$server = GetServer($serverInstance)
$db = $server.Databases["$destinationDbName"]
if(-not $db)
{
    Write-Host "Database does not exist on: $serverInstance"
}
else
{
    Write-Host "Saving permissions on $destinationDbName..." -NoNewline
    $commandList = $db.ExecuteWithResults(
        @"
IF OBJECT_ID('tempdb..#Commands') IS NOT NULL
    DROP TABLE #Commands;
CREATE TABLE #Commands(
     RowId int identity(1,1)
    ,Cmd varchar(2000));

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(d.name,CHAR(39))+') ALTER USER ' + QUOTENAME(d.name) + ' WITH LOGIN = ' + QUOTENAME(s.name) + ';'
FROM [$destinationDbName].sys.database_principals d LEFT OUTER JOIN master.sys.server_principals s
    ON d.sid = s.sid
WHERE s.name IS NOT NULL
    AND d.type = 'S'
    AND d.name <> 'dbo';

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(d.name,CHAR(39))+') CREATE USER ' + QUOTENAME(d.name) + ' FOR LOGIN ' + QUOTENAME(s.name) + ' WITH DEFAULT_SCHEMA = ' + QUOTENAME(d.default_schema_name) + ';'
FROM [$destinationDbName].sys.database_principals d LEFT OUTER JOIN master.sys.server_principals s
    ON d.sid = s.sid
WHERE s.name IS NOT NULL
    AND d.type = 'S'
    AND d.name <> 'dbo';

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(d.name,CHAR(39))+') CREATE USER ' + QUOTENAME(d.name) + ' FOR LOGIN ' + QUOTENAME(s.name) + ';'
FROM [$destinationDbName].sys.database_principals d LEFT OUTER JOIN master.sys.server_principals s
    ON d.sid = s.sid
WHERE s.name IS NOT NULL
    AND d.type IN ('U','G');

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(p.name,CHAR(39))+') CREATE ROLE ' + QUOTENAME(p.name) + ' AUTHORIZATION '+QUOTENAME(o.name)+';'
FROM [$destinationDbName].sys.database_principals p JOIN [$destinationDbName].sys.database_principals o
    ON o.principal_id = p.owning_principal_id
WHERE p.type = 'R'
    AND p.is_fixed_role = 0
    AND p.principal_id <> 0;

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];EXEC sp_addrolemember N' + QUOTENAME(d.name,'''') + ', N' + QUOTENAME(m.name,CHAR(39)) + ';'
FROM [$destinationDbName].sys.database_role_members r JOIN [$destinationDbName].sys.database_principals d
    ON r.role_principal_id = d.principal_id JOIN [$destinationDbName].sys.database_principals m
    ON r.member_principal_id = m.principal_id
WHERE m.principal_id > 5;

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];' + dp.state_desc + ' ' + dp.permission_name + ' TO ' + QUOTENAME(d.name) COLLATE Latin1_General_CI_AS + ';'
FROM [$destinationDbName].sys.database_permissions dp JOIN [$destinationDbName].sys.database_principals d
    ON dp.grantee_principal_id = d.principal_id
WHERE dp.major_id = 0
    AND dp.state <> 'W'
    AND dp.permission_name <> 'CONNECT'
ORDER BY d.name, dp.permission_name ASC, dp.state_desc ASC;

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];GRANT ' + dp.permission_name + ' TO ' + QUOTENAME(d.name) COLLATE Latin1_General_CI_AS + ' WITH GRANT OPTION;'
FROM [$destinationDbName].sys.database_permissions dp JOIN [$destinationDbName].sys.database_principals d
    ON dp.grantee_principal_id = d.principal_id
WHERE dp.major_id = 0
    AND dp.state = 'W'
    AND dp.permission_name <> 'CONNECT'
ORDER BY d.name, dp.permission_name ASC, dp.state_desc ASC;

SELECT Cmd FROM #Commands
ORDER BY RowId;
"@
    )
}
This gives us the existing permissions that we’ll re-apply later in step 6.  You can see we’re creating code to resync logins (ALTER USER…WITH LOGIN), create the user if it doesn’t exist, create database roles if they don’t exist, and add users to those database roles.

Step 5:  Restore the backup file to $destinationInstance using the $destinationDbName name.  This is the real meat and potatoes of the script.  
$restore = New-Object ('Microsoft.SqlServer.Management.Smo.Restore')
$restore.Database = $destinationDbName
$restore.NoRecovery = $false
$restore.PercentCompleteNotification = 10
$restore.Devices.AddDevice($backupDataFile, [Microsoft.SqlServer.Management.Smo.DeviceType]::File)

First it checks $destinationInstance to see if $destinationDbName already exists.  If it does, then it just restores over it.  If $destinationDbName does not exist, then the script will create it using the RESTORE…WITH MOVE command.  Since the source and destination SQL Servers have different instance names, the file folders for the physical MDF & LDF files will be different.  The script uses the default folder locations to store the data and log files.  These folders were specified when you installed SQL Server.  If $sourceDbName has several NDF files, all of them will be placed in the default data folder.
$defaultMdf = $server.Settings.DefaultFile
$defaultLdf = $server.Settings.DefaultLog
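Under the covers, an SMO restore with relocated files is equivalent to a RESTORE…WITH MOVE statement. A sketch of the T-SQL it effectively generates (the logical file names and target folders below are illustrative, not the exact values the script computes):

```sql
RESTORE DATABASE [AdventureWorks2008R2]
FROM DISK = N'\\TRON3\BACKUP\AdventureWorks2008R2_db_201302060836.BAK'
WITH MOVE N'AdventureWorks2008R2_Data' TO N'D:\Data\AdventureWorks2008R2.mdf',  -- default data folder
     MOVE N'AdventureWorks2008R2_Log'  TO N'D:\Log\AdventureWorks2008R2_log.ldf', -- default log folder
     REPLACE,       -- overwrite the existing database
     STATS = 10;    -- progress notifications every 10 percent
```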

Before the restore, the script will set the recovery mode of $destinationDbName to SIMPLE.  This is to avoid the “backup tail log” error message in case the database is in FULL recovery mode.  It sets the database to single-user mode to kill any existing connections before the restore.  And after the restore is complete, it sets the recovery mode back to SIMPLE.
$db.RecoveryModel = [Microsoft.SqlServer.Management.Smo.RecoveryModel]::Simple
$db.UserAccess = "Single"
$db.Alter(
    [Microsoft.SqlServer.Management.Smo.TerminationClause]
    "RollbackTransactionsImmediately")
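In plain T-SQL, those SMO property changes amount to something like the following sketch:

```sql
-- Avoid the tail-log backup requirement for databases in FULL recovery
ALTER DATABASE [AdventureWorks2008R2] SET RECOVERY SIMPLE;

-- Kill any existing connections before the restore
ALTER DATABASE [AdventureWorks2008R2] SET SINGLE_USER
    WITH ROLLBACK IMMEDIATE;
```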
Step 6:  Apply the saved permissions from step 4 to $destinationDbName.  These are the permissions that were scripted out from step 4.  They are applied to the $destinationDbName one line at a time.
foreach($Row in $commandList.Tables[0].Rows)
{
    $db.ExecuteNonQuery($Row["Cmd"])
}

Step 7:  Delete the backup from $destinationPath.  This is the cleanup step.
Remove-Item $backupFile

When running the script from a console, the output will look like this.
=============================================================
 1: Perform Initial Checks & Validate Input Parameters
=============================================================
Validating parameters…OK
Verifying source SQL Server connectivity…OK
Verifying source database exists…OK
Verifying destination SQL Server connectivity…OK
Verifying source file share exists…OK
Verifying destination file share exists…OK
=============================================================
 2: Get Source Backup for the Restore
=============================================================
Connecting to TRON2\R2PROD to find a restore file…
Selected file:  D:\Backup\R2PROD\AdventureWorks2008R2\AdventureWorks2008R2_db_201302060836.BAK
Verifying file: \\TRON2\BACKUP\R2PROD\AdventureWorks2008R2\AdventureWorks2008R2_db_201302060836.BAK exists…
Source file existence: OK
=============================================================
 3: Copy Backup File to the Destination
=============================================================
Copying file…
Copy file: OK
=============================================================
 4: Get Current Permissions on the Destination Database
=============================================================
Saving permissions on AdventureWorks2008R2…OK
=============================================================
 5: Restore Backup File to the Destination Server
=============================================================
Restoring database…
Database Restore: OK
=============================================================
 6: Restore Permissions to the Destination Database
=============================================================
Restoring existing permissions…
Existing permissions restored: OK
=============================================================
 7: Delete Backup File from the Destination Server
=============================================================
Deleting file…
Delete file: OK
=============================================================
    Database refresh completed successfully
=============================================================

The best part about using the Powershell script is that you can set up a SQL Agent job to call the script with the parameters already specified.  That way, when the developer asks you to refresh the same database, all you have to do is run the job, or you can work with the developer to schedule the job to run automatically each day.

The SQL Agent job will need to be set up as an “Operating system (CmdExec)” job type.  This is because the script uses Powershell components that are outside the normal SQLPS group of commands.
The entire script is below.  Feel free to modify it as you see fit for your environment.
###########################################################################################
#
#   File Name:    AutoDatabaseRefresh.ps1
#
#   Applies to:   SQL Server 2008
#                 SQL Server 2008 R2
#                 SQL Server 2012
#
#   Purpose:      Used to automatically restore a database in another environment.
#
#   Prerequisite: Powershell v2.0 must be installed.
#                 SQL Server components must be installed.
#
#   Parameters:   [string]$sourceInstance - Source SQL Server name (Ex: SERVER\INSTANCE)
#                 [string]$sourceDbName - Source database
#                 [string]$sourcePath - Source share where the file exists
#                 [string]$destinationInstance - Destination SQL Server name (Ex: SERVER\INSTANCE)
#                 [string]$destinationDbName - Database to be refreshed/created on destination server
#                 [string]$destinationPath - Share to copy backup file to (UNC Path Ex: \\SERVER\backup$)
#
#   Author:       Patrick Keisler
#
#   Version:      1.0.0
#
#   Date:         02/06/2013
#
#   Help:         http://www.patrickkeisler.com/
#
###########################################################################################

#Enable Debug Messages
#$DebugPreference = "Continue"

#Disable Debug Messages
$DebugPreference = "SilentlyContinue"

#Terminate Code on All Errors
$ErrorActionPreference = "Stop"

#Clear screen
CLEAR

#Load Assemblies
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | out-null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoExtended') | out-null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ConnectionInfo') | out-null

function CheckForErrors {
    $errorsReported = $False
    if($Error.Count -ne 0)
    {
		Write-Host
		Write-Host "******************************"
        Write-Host "Errors:" $Error.Count
        Write-Host "******************************"
        foreach($err in $Error)
        {
            $errorsReported  = $True
            if( $err.Exception.InnerException -ne $null)
            {
                Write-Host $err.Exception.InnerException.ToString()
            }
            else
            {
                Write-Host $err.Exception.ToString()
            }
            Write-Host
        }
        throw;
    }
}
function GetServer {
    Param([string]$serverInstance)

    $server = New-Object ("Microsoft.SqlServer.Management.Smo.Server")($serverInstance)
    $server.ConnectionContext.ApplicationName = "AutoDatabaseRefresh"
	$server.ConnectionContext.ConnectTimeout = 5
    $server;
}
function GetRestoreFileList {
	Param([string]$serverInstance, [string]$sourcePath)

	Write-Host "Connecting to $serverInstance to find a restore file..."
	$server = GetServer($serverInstance)
	$db = $server.Databases["msdb"]
	$fileList = $db.ExecuteWithResults(
	@"
DECLARE
	 @BackupId int
	,@DatabaseName nvarchar(255);

SET	@DatabaseName = '$sourceDbName';
		
-- Get the most recent full backup for this database
SELECT TOP 1
	 @DatabaseName AS DatabaseName
	,m.physical_device_name
	,RIGHT(m.physical_device_name, CHARINDEX('\',REVERSE(physical_device_name),1) - 1) AS 'FileName'
	,b.backup_finish_date
	,b.type AS 'BackupType'
FROM msdb.dbo.backupset b JOIN msdb.dbo.backupmediafamily m
ON b.media_set_id = m.media_set_id
WHERE b.database_name = @DatabaseName
	AND b.type = 'D'
	AND b.is_snapshot = 0
	AND b.is_copy_only = 0
	AND b.backup_finish_date IS NOT NULL
ORDER BY b.database_backup_lsn DESC;
"@
	)

	CheckForErrors
	
	if ($fileList.Tables[0].Rows.Count -ge 1)
	{
		foreach($file in $fileList.Tables[0].Rows)
		{
			$source = $sourcePath + "\" + $file["FileName"]
			Write-Host "Selected file: " $file["physical_device_name"]
			
		    Write-Host "Verifying file: $source exists..."
		    if((Test-Path -Path $source) -ne $True)
		    {
	            $errorMessage = "File:" + $source + " does not exists"
	            throw $errorMessage
		    }
		}

		Write-Host "Source file existence: OK"
		$file["FileName"].ToString();
	}
	else
	{
        $errorMessage = "Source database " + $sourceDbName + " does not have any current full backups."
        throw $errorMessage	
	}
}
function GetExistingPermissions {
    Param([string]$serverInstance, [string]$destinationDbName)

    $server = GetServer($serverInstance)
	$db = $server.Databases["$destinationDbName"]

	if(-not $db)
	{
		Write-Host "Database does not exist on: $serverInstance"
	}
	else
	{
		Write-Host "Saving permissions on $destinationDbName..." -NoNewline
		$commandList = $db.ExecuteWithResults(
		@"
IF OBJECT_ID('tempdb..#Commands') IS NOT NULL
	DROP TABLE #Commands;
CREATE TABLE #Commands(
	 RowId int identity(1,1)
	,Cmd varchar(2000));

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(d.name,CHAR(39))+') ALTER USER ' + QUOTENAME(d.name) + ' WITH LOGIN = ' + QUOTENAME(s.name) + ';'
FROM [$destinationDbName].sys.database_principals d LEFT OUTER JOIN master.sys.server_principals s
	ON d.sid = s.sid 
WHERE s.name IS NOT NULL
	AND d.type = 'S'
	AND d.name <> 'dbo';

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(d.name,CHAR(39))+') CREATE USER ' + QUOTENAME(d.name) + ' FOR LOGIN ' + QUOTENAME(s.name) + ' WITH DEFAULT_SCHEMA = ' + QUOTENAME(d.default_schema_name) + ';'
FROM [$destinationDbName].sys.database_principals d LEFT OUTER JOIN master.sys.server_principals s
	ON d.sid = s.sid 
WHERE s.name IS NOT NULL
	AND d.type = 'S'
	AND d.name <> 'dbo';

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(d.name,CHAR(39))+') CREATE USER ' + QUOTENAME(d.name) + ' FOR LOGIN ' + QUOTENAME(s.name) + ';'
FROM [$destinationDbName].sys.database_principals d LEFT OUTER JOIN master.sys.server_principals s
	ON d.sid = s.sid 
WHERE s.name IS NOT NULL
	AND d.type IN ('U','G');

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+QUOTENAME(p.name,CHAR(39))+') CREATE ROLE ' + QUOTENAME(p.name) + ' AUTHORIZATION '+QUOTENAME(o.name)+';'
FROM [$destinationDbName].sys.database_principals p JOIN [$destinationDbName].sys.database_principals o
	ON o.principal_id = p.owning_principal_id
WHERE p.type = 'R' 
	AND p.is_fixed_role = 0 
	AND p.principal_id <> 0;

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];EXEC sp_addrolemember N' + QUOTENAME(d.name,CHAR(39)) + ', N' + QUOTENAME(m.name,CHAR(39)) + ';'
FROM [$destinationDbName].sys.database_role_members r JOIN [$destinationDbName].sys.database_principals d
	ON r.role_principal_id = d.principal_id JOIN [$destinationDbName].sys.database_principals m
	ON r.member_principal_id = m.principal_id
WHERE m.principal_id > 5;

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];' + dp.state_desc + ' ' + dp.permission_name + ' TO ' + QUOTENAME(d.name) COLLATE Latin1_General_CI_AS + ';'
FROM [$destinationDbName].sys.database_permissions dp JOIN [$destinationDbName].sys.database_principals d
	ON dp.grantee_principal_id = d.principal_id
WHERE dp.major_id = 0 
	AND dp.state <> 'W'
	AND dp.permission_name <> 'CONNECT'
ORDER BY d.name, dp.permission_name ASC, dp.state_desc ASC;

INSERT #Commands(Cmd)
SELECT 'USE [$destinationDbName];GRANT ' + dp.permission_name + ' TO ' + QUOTENAME(d.name) COLLATE Latin1_General_CI_AS + ' WITH GRANT OPTION;'
FROM [$destinationDbName].sys.database_permissions dp JOIN [$destinationDbName].sys.database_principals d
	ON dp.grantee_principal_id = d.principal_id
WHERE dp.major_id = 0 
	AND dp.state = 'W'
	AND dp.permission_name <> 'CONNECT'
ORDER BY d.name, dp.permission_name ASC, dp.state_desc ASC;

SELECT Cmd FROM #Commands
ORDER BY RowId;
"@
		)

		CheckForErrors
		Write-Host "OK"
	}

	$commandList;
}
function CopyFile {
    Param([string]$sourcePath, [string]$backupFile, [string]$destinationPath)

    $source = $sourcePath + "\" + $backupFile
    
    try
    {
        Write-Host "Copying file..."
		Write-Debug "Copy $source to $destinationPath"
		#Use -ErrorAction Stop so non-terminating copy errors reach the catch block
		copy-item $source -destination $destinationPath -ErrorAction Stop
    }
    catch
    {
        CheckForErrors
    }

	Write-Host "Copy file: OK"
}
function DeleteFile {
    Param([string]$backupFile)

    try
    {
        Write-Host "Deleting file..."
		Write-Debug "Deleting file: $backupFile"
		#Use -ErrorAction Stop so non-terminating delete errors reach the catch block
		remove-item $backupFile -ErrorAction Stop
    }
    catch
    {
        CheckForErrors
    }
    
    Write-Host "Delete file: OK"
}
function RestoreDatabase {
    Param([string]$serverInstance, [string]$destinationDbName, [string]$backupDataFile, [string]$actionType)

	Write-Host "Restoring database..."

    $server = GetServer($serverInstance)
	$server.ConnectionContext.StatementTimeout = 0
	$db = $server.Databases["$destinationDbName"]

    #Create the restore object and set properties
    $restore = new-object ('Microsoft.SqlServer.Management.Smo.Restore')
    $restore.Database = $destinationDbName
    $restore.NoRecovery = $false
    $restore.PercentCompleteNotification = 10
    $restore.Devices.AddDevice($backupDataFile, [Microsoft.SqlServer.Management.Smo.DeviceType]::File)

	if(-not $db)
	{
		Write-Debug "$destinationDbName does not exist..."
		
        #Grab the default MDF & LDF file locations.
        $defaultMdf = $server.Settings.DefaultFile
    	$defaultLdf = $server.Settings.DefaultLog

        #If the default locations are the same as the master database, 
        #then those values do not get populated and must be pulled from the MasterPath.
        if($defaultMdf.Length -eq 0)
        {
                $defaultMdf = $server.Information.MasterDBPath
    	}
    	if($defaultLdf.Length -eq 0)
        {
                $defaultLdf = $server.Information.MasterDBLogPath
        }
        
        if(-not $defaultMdf.EndsWith("\"))
        {
            $defaultMdf = $defaultMdf + "\"
        }
        if(-not $defaultLdf.EndsWith("\"))
        {
            $defaultLdf = $defaultLdf + "\"
        }
        		
        $restore.ReplaceDatabase = $True

		#Get the database logical file names            
        try
		{
			$logicalNameDT = $restore.ReadFileList($server)
		}
		catch
		{
			CheckForErrors
		}

        $FileType = ""

		Write-Debug "Restoring $destinationDbName to the following physical locations:"

        foreach($Row in $logicalNameDT)
        {
            # Put the file type into a local variable.
            # This will be the variable that we use to find out which file
            # we are working with.
            $FileType = $Row["Type"].ToUpper()

            # If Type = "D", then we are handling the Database File name.
            If($FileType.Equals("D"))
            {
                $dbLogicalName = $Row["LogicalName"]
				
				$targetDbFilePath = $Row["PhysicalName"]
				$position = $targetDbFilePath.LastIndexOf("\") + 1
				$targetDbFilePath = $targetDbFilePath.Substring($position,$targetDbFilePath.Length - $position)
				$targetDbFilePath = $defaultMdf + $targetDbFilePath
				
			    #If a file with this name already exists on the destination,
			    #rename the target data file using the destination database name
			    if((Test-Path -Path $targetDbFilePath) -eq $true)
				{
					$targetDbFilePath = $targetDbFilePath -replace $dbLogicalName, $destinationDbName
				}

				#Specify new data files (mdf and ndf)
		        $relocateDataFile = new-object ('Microsoft.SqlServer.Management.Smo.RelocateFile')
		        $relocateDataFile.LogicalFileName = $dbLogicalName            
		        $relocateDataFile.PhysicalFileName = $targetDbFilePath
		        $restore.RelocateFiles.Add($relocateDataFile) | out-null
		
				Write-Debug $relocateDataFile.PhysicalFileName
            }
            # If Type = "L", then we are handling the Log File name.
            elseif($FileType.Equals("L"))
            {
                $logLogicalName = $Row["LogicalName"]
				
				$targetLogFilePath = $Row["PhysicalName"]
				$position = $targetLogFilePath.LastIndexOf("\") + 1
				$targetLogFilePath = $targetLogFilePath.Substring($position,$targetLogFilePath.Length - $position)
				$targetLogFilePath = $defaultLdf + $targetLogFilePath

			    #If a file with this name already exists on the destination,
			    #rename the target log file using the destination database name
			    if((Test-Path -Path $targetLogFilePath) -eq $true)
				{
					$tempName = $destinationDbName + "_Log"
					$targetLogFilePath = $targetLogFilePath -replace $logLogicalName, $tempName
				}

				#Specify new log files (ldf)
		        $relocateLogFile  = new-object ('Microsoft.SqlServer.Management.Smo.RelocateFile')
		        $relocateLogFile.LogicalFileName = $logLogicalName            
		        $relocateLogFile.PhysicalFileName = $targetLogFilePath          
    		    $restore.RelocateFiles.Add($relocateLogFile) | out-null
		
				Write-Debug $relocateLogFile.PhysicalFileName
	        }          
        }
	}
	else
	{
		Write-Debug "Overwriting existing database..."
		
		#Set recovery model to simple on destination database before restore
		if($db.RecoveryModel -ne [Microsoft.SqlServer.Management.Smo.RecoveryModel]::Simple)
    	{
            Write-Debug "Changing recovery model to SIMPLE"
            $db.RecoveryModel = [Microsoft.SqlServer.Management.Smo.RecoveryModel]::Simple
			try
			{
            	$db.Alter()
			}
			catch
			{
				CheckForErrors
			}
    	}

		#Set destination database to single user mode to kill any active connections
		$db.UserAccess = "Single"
		try
		{
			$db.Alter([Microsoft.SqlServer.Management.Smo.TerminationClause]"RollbackTransactionsImmediately")
		}
		catch
		{
			CheckForErrors
		}
	}
	
    #Do the restore
	try
	{
    	$restore.SqlRestore($server)
	}
	catch
	{
    	CheckForErrors
	}
	
	#Reload the restored database object
	$db = $server.Databases["$destinationDbName"]
	
	#Set recovery model to simple on destination database after restore
	if($db.RecoveryModel -ne [Microsoft.SqlServer.Management.Smo.RecoveryModel]::Simple)
    {
        Write-Debug "Changing recovery model to SIMPLE"
        $db.RecoveryModel = [Microsoft.SqlServer.Management.Smo.RecoveryModel]::Simple
		try
		{
        	$db.Alter()
		}
		catch
		{
			CheckForErrors
		}
    }
	
    Write-Host $actionType.ToString() "Restore: OK"
}
function RestorePermissions {
    Param([string]$destinationInstance, [string]$destinationDbName, $commandList)

	Write-Host "Restoring existing permissions..."
	$server = GetServer($destinationInstance)
	$db = $server.Databases[$destinationDbName]
	
	foreach($Row in $commandList.Tables[0].Rows)
	{
		#Apply existing permissions back to destination database
		Write-Debug $Row["Cmd"]
		try
		{
			$db.ExecuteNonQuery($Row["Cmd"])
		}
		catch
		{
			CheckForErrors
		}
	}
	
	Write-Host "Existing permissions restored: OK"
}
function PerformValidation {
    Param($sourceInstance, $sourceDbName, $sourcePath, $destinationInstance, $destinationDbName, $destinationPath)
	
	Write-Host "Validating parameters..." -NoNewline
	
	if([String]::IsNullOrEmpty($sourceInstance))
	{
		Write-Host "ERROR"
        $errorMessage = "Source server name is not valid."
        throw $errorMessage
    }
    if([String]::IsNullOrEmpty($sourceDbName))
    {
		Write-Host "ERROR"
        $errorMessage = "Source database name is not valid."
        throw $errorMessage
    }
    if([String]::IsNullOrEmpty($sourcePath))
    {
		Write-Host "ERROR"
        $errorMessage = "Source path is not valid."
        throw $errorMessage
    }
	else
    {
        if(-not $sourcePath.StartsWith("\\"))
        {
			Write-Host "ERROR"
            $errorMessage = "Source path is not valid: " + $sourcePath
            throw $errorMessage
        }
    }
    if([String]::IsNullOrEmpty($destinationInstance))
    {
		Write-Host "ERROR"
        $errorMessage = "Destination server name is not valid."
        throw $errorMessage
    }
    if([String]::IsNullOrEmpty($destinationDbName))
    {
		Write-Host "ERROR"
        $errorMessage = "Destination database name is not valid."
        throw $errorMessage
    }
	if([String]::IsNullOrEmpty($destinationPath))
    {
		Write-Host "ERROR"
        $errorMessage = "Destination path name is not valid."
        throw $errorMessage
    }
	else
    {
        if(-not $destinationPath.StartsWith("\\"))
        {
			Write-Host "ERROR"
            $errorMessage = "Destination path is not valid: " + $destinationPath
            throw $errorMessage
        }
    }
    
	Write-Host "OK"

    Write-Host "Verifying source SQL Server connectivity..." -NoNewline
	$conn = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($sourceInstance)
    $conn.ApplicationName = "AutoDatabaseRefresh"
	$conn.NonPooledConnection = $true
	$conn.ConnectTimeout = 5
	try
	{
		$conn.Connect()
        $conn.Disconnect()
	}
	catch
	{
		CheckForErrors
	}
    Write-Host "OK"
	
	Write-Host "Verifying source database exists..." -NoNewline
	$sourceServer = GetServer($sourceInstance)
    $sourcedb = $sourceServer.Databases[$sourceDbName]
	if(-not $sourcedb)
    {
		Write-Host "ERROR"
        $errorMessage = "Source database does not exist on $sourceInstance"
        throw $errorMessage
    }
    Write-Host "OK"

    Write-Host "Verifying destination SQL Server connectivity..." -NoNewline
	$conn = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($destinationInstance)
    $conn.ApplicationName = "AutoDatabaseRefresh"
	$conn.NonPooledConnection = $true
	$conn.ConnectTimeout = 5
	try
	{
		$conn.Connect()
        $conn.Disconnect()
	}
	catch
	{
		CheckForErrors
	}
	Write-Host "OK"
	
    Write-Host "Verifying source file share exists..." -NoNewline
    if((Test-Path -Path $sourcePath) -ne $True)
    {
		Write-Host "ERROR"
        $errorMessage = "File share: " + $sourcePath + " does not exist"
        throw $errorMessage
    }
    Write-Host "OK"

    Write-Host "Verifying destination file share exists..." -NoNewline
    if((Test-Path -Path $destinationPath) -ne $True)
    {
		Write-Host "ERROR"
        $errorMessage = "File share: " + $destinationPath + " does not exist"
        throw $errorMessage
    }
    Write-Host "OK"
}

function Main{
	
	Param([string]$sourceInstance, [string]$sourceDbName, [string]$sourcePath, [string]$destinationInstance, [string]$destinationDbName, [string]$destinationPath)
	
	$Error.Clear()

    Write-Host
    Write-Host "============================================================="
    Write-Host " 1: Perform Initial Checks & Validate Input Parameters"
    Write-Host "============================================================="
	
	PerformValidation $sourceInstance $sourceDbName $sourcePath $destinationInstance $destinationDbName $destinationPath
	
    Write-Host
    Write-Host "============================================================="
    Write-Host " 2: Get Source Backup for the Restore"
    Write-Host "============================================================="

	$restoreFile = GetRestoreFileList $sourceInstance $sourcePath
		
    Write-Host
    Write-Host "============================================================="
    Write-Host " 3: Copy Backup File to the Destination"
    Write-Host "============================================================="
	
	CopyFile $sourcePath $restoreFile $destinationPath

    Write-Host
    Write-Host "============================================================="
    Write-Host " 4: Get Current Permissions on the Destination Database"
    Write-Host "============================================================="

	$existingPermissions = GetExistingPermissions $destinationInstance $destinationDbName
	
    Write-Host
    Write-Host "============================================================="
    Write-Host " 5: Restore Backup File to the Destination Server"
    Write-Host "============================================================="

	$restoreFile = $destinationPath + "\" + $restoreFile
	RestoreDatabase $destinationInstance $destinationDbName $restoreFile "Database"
		
    Write-Host
    Write-Host "============================================================="
    Write-Host " 6: Restore Permissions to the Destination Database"
    Write-Host "============================================================="

	if($existingPermissions)
	{
		RestorePermissions $destinationInstance $destinationDbName $existingPermissions
	}
	
    Write-Host
    Write-Host "============================================================="
    Write-Host " 7: Delete Backup File from the Destination Server"
    Write-Host "============================================================="

	DeleteFile $restoreFile
	
    Write-Host
    Write-Host "============================================================="
    Write-Host "    Database refresh completed successfully"
    Write-Host "============================================================="
}

#Hard-coded values used only for development
#$sourceInstance = "TRON2\R2PROD"
#$sourceDbName = "AdventureWorks2008R2"
#$sourcePath = "\\TRON2\BACKUP\R2PROD\AdventureWorks2008R2"
#$destinationInstance = "TRON3\R2TEST"
#$destinationDbName = "AdventureWorks2008R2"
#$destinationPath = "\\TRON3\BACKUP"

#Prompt for inputs for an interactive script
#$sourceInstance = $(Read-Host "Source SQL Server name (Ex: SERVER\INSTANCE)")
#$sourceDbName = $(Read-Host "Source database")
#$sourcePath = $(Read-Host "Source share where the file exists (UNC Path Ex: \\SERVER\BACKUP)")
#$destinationInstance = $(Read-Host "Destination SQL Server name (Ex: SERVER\INSTANCE)")
#$destinationDbName = $(Read-Host "Database to be refreshed/created on destination server")
#$destinationPath = $(Read-Host "Destination share to copy backup file to (UNC Path Ex: \\SERVER\BACKUP)")

#Capture inputs from the command line.
$sourceInstance = $args[0]
$sourceDbName = $args[1]
$sourcePath = $args[2]
$destinationInstance = $args[3]
$destinationDbName = $args[4]
$destinationPath = $args[5]

$debug = "Source Instance Parameter: " + $sourceInstance
Write-Debug $debug
$debug = "Source Database Parameter: " + $sourceDbName
Write-Debug $debug
$debug = "Source Path Parameter: " + $sourcePath
Write-Debug $debug
$debug = "Destination Instance Parameter: " + $destinationInstance
Write-Debug $debug
$debug = "Destination Database Parameter: " + $destinationDbName
Write-Debug $debug
$debug = "Destination Path Parameter: " + $destinationPath
Write-Debug $debug

Main $sourceInstance $sourceDbName $sourcePath $destinationInstance $destinationDbName $destinationPath