SQL Saturday #222 – Another successful endeavor

The second annual SQL Saturday held in Sacramento was another smashing success! With an updated venue and an international speaker list, attendees were given a full day of FREE training in everything SQL Server. Enabled by our sponsors’ generosity, the Sacramento SQL Server users group was able to host a fantastic event.

Gifts for this year’s speakers, customized for Sacramento.

Led by our user group president Mitch Bottel (B|T), the planning committee took the success of year one and began navigating the planning process in January. The Patrick Hayes Learning Center in Natomas, graciously donated by Sutter Health, was chosen to house the event. With plenty of conference rooms, wi-fi and a large break room, the facility was perfect for a SQL Saturday. We gave each of our platinum sponsors their “own” room, allowing them to host a lunch session and decorate as they saw fit. Five tracks gave attendees plenty of subjects to choose from.

Sacramento was represented by local speakers Eric Freeman (B|T), Jon Amen (B|T), Angel Abundez (B|T) (we claim Angel even though he’s semi-Bay Area), Dilip Nayak (T), Mitch Bottel and yours truly, Dan Hess (B|T).  This was my first SQL Saturday as a presenter and I was lucky enough to have Steve Jones (B|T) sit in on my session and give me some great feedback.  Prior to the event he was giving me “the business” about how similar my session was to the talk he gave at our first SQL Saturday.  Good thing I didn’t hit his presentation last year!

It was great seeing my friends from last year, even if Cannonball didn’t make it.  Shout out to Mike Fal (B|T) and John Morehouse (B|T).  It was great meeting new friends also…David Klee (B|T) and Jim Murphy (B|T).  I wish I would have spent more time getting to know the other speakers but responsibilities of the planning committee kept me busy during the day.

A special thanks goes out to all of our volunteers, many of whom are a part of our user group.  The event’s success was directly related to your participation.

I’m looking forward to Sacramento’s third annual SQL Saturday next year!

Running .sql files from within the Query Editor using SQLCMD mode

As a DBA, I have a plethora of queries that I run on a consistent basis, commands that give me information about who’s connected to the instance, how mirroring is performing, what types of permissions a user has, etc. Early in my career I began cataloging these queries. My first file was called ‘useful_sql.txt’ (I still have it to this day) and it grew longer with each learned action. I’d open up SSMS, click the folder icon to open the file and cut/paste any commands I needed into a new editor window. What a pain. As I grew in the profession, I began creating individual .sql files with a meaningful title for each specific query, but I still had to click the folder, find the file, wait for it to open in SSMS, then click to execute the query. A little less painful, but still not ideal.

PowerShell offered some options but meant I had to leave the query editor.

Here was my solution: SQLCMD mode.

Enabling SQLCMD mode in the query editor allows you to run commands just as you would from the SQLCMD prompt.  The command I find most useful is:

:r <filename>

The file is read and the statements executed. Now, instead of the hassle of opening a file and executing the commands through a multi-click process, a query can be run without leaving the query editor and keyboard, e.g.

:r I:\scripts\showSQL.sql

This executes the following query stored in the ‘showSQL.sql’ file that shows me who’s running which statement:

SELECT
   -- extract just the statement currently executing from the full batch text;
   -- offsets are in bytes of Unicode text, hence the divide-by-2 and +1 adjustments
   SUBSTRING(sqltext.text, (req.statement_start_offset / 2) + 1,
      ((CASE req.statement_end_offset
           WHEN -1 THEN DATALENGTH(sqltext.text)
           ELSE req.statement_end_offset
        END - req.statement_start_offset) / 2) + 1) AS [SQL statement],
   req.blocking_session_id,
   req.session_id,
   req.status,
   req.command,
   DB_NAME(req.database_id) AS [db_name],
   sess.login_name AS [user name],
   req.start_time,
   req.cpu_time,
   req.total_elapsed_time
FROM
   sys.dm_exec_requests req
CROSS APPLY
   sys.dm_exec_sql_text(req.sql_handle) AS sqltext
JOIN
   sys.dm_exec_sessions sess
   ON req.session_id = sess.session_id
WHERE
   req.session_id > 50   -- exclude system sessions
ORDER BY
   req.status


For the above example, I have a file share that I’ve mapped to the I: drive. I place all my scripts in the \scripts directory on that drive. Files are read relative to the directory that SQLCMD is invoked from; from the query editor this usually means C:\Windows\System32. While you could place all your .sql scripts there, it’s not a good idea. I suggest a fully qualified path to a place of your choosing.
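To avoid hard-coding the full path in every call, SQLCMD mode also supports scripting variables via :setvar. A minimal sketch (the variable name and path here are just examples):

```sql
-- Define a variable for the script directory, then reference it with $(...)
:setvar scriptpath "I:\scripts"
:r $(scriptpath)\showSQL.sql
```

This keeps the path in one place, so a change of drive mapping means editing a single line.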

SQLCMD mode has to be enabled in the SSMS query editor. It can be done per query editor session or set so that new editor windows open in SQLCMD mode by default:

  • To enable SQLCMD mode for the current query editor session, go to the Query menu and select SQLCMD Mode.
  • To enable SQLCMD mode by default, go to Tools > Options. In the left-hand pane, click Query Execution, then check the box labeled By default, open new queries in SQLCMD mode.


There are a few other uses for SQLCMD mode, as well as some limitations. Here’s the link to BOL if you’d like more information:

http://msdn.microsoft.com/en-us/library/ms174187(v=SQL.105).aspx

Happy scripting!

<Disclaimer: the sample script was not created by me but I do not remember from where I copied it.  If it’s your work, please post a comment and I’ll be happy to give you credit.>

SQL Saturday #222 coming to Sacramento CA on July 27, 2013

The Sacramento SQL Server users group is happy to announce the second annual SQL Saturday in Sacramento to be held on July 27, 2013.  Our schedule boasts five tracks with six sessions in each track, adding up to 30 speakers coming from all parts of the globe.  We will be well represented with local talent, filling 20% of the speaking duties.  In addition to the all-star lineup on Saturday, Kalen Delaney (@sqlqueen) will be doing a pre-conference workshop on Friday, July 26.

Make sure to mark your calendars for these two amazing events!

Click here to register for Kalen’s workshop “Performance Tuning with Indexes: Internals and Best Practices”

Click here to register for SQL Saturday #222

SQL Saturday #144 – The Good, the Bad and the Ugly

Sacramento’s gift to all who spoke at our event

SQL Saturday #144 is over but what a great time it was!  Sacramento’s first SQL Saturday boasted close to 200 attendees and showcased great speakers like Steve Jones (@way0utwest) and Kalen Delaney (@sqlqueen).  We were represented with local talent Mitch Bottel (@SacSQLdude), Helen Norris (@hpnorris), Jon Amen (@dungeongurunet), Eric Freeman (@sqlelf), Tom Keller (@tomkeller02) and Angel Abundez (@angelstreamline)…and there were COWBELLS! 

In November of 2011, members of the Sacramento SQL Server user group banded together to begin planning the event. Using documentation from sqlsaturday.com, we began the planning with a classic project management tool, the Work Breakdown Structure. It allowed us to focus our members’ strengths and build a solid project plan. For the next 9 months, we met on a monthly basis and stayed coordinated through emails and phone calls.

Our first challenge was finding an appropriate venue.  As I was working at CSU, Sacramento at the time, it seemed a logical choice.  As we dug deeper, we found an alternate location in Natomas which appeared to fit our needs perfectly.  Our foundation was set: we could begin planning food, locations for the sponsors, rooms for our speakers, etc. 

We found our location in early 2012 and went along with our plans: gathering sponsors and speakers and marketing the event. Everything was moving along nicely…too nicely. Before our monthly meeting in May, I decided to make one more trip to the venue and draw out a floor plan. As I was putting my ideas on paper, I found our contact and began showing him my plans. He quickly stopped me and showed me the error of my thinking: we did not get a “common area” for the sponsors, and lunch would have to be confined to each of the rooms. The rooms were on two different floors and participants could only move back and forth between the rooms, no loitering. Registration would also need to take place in each room. He may as well have punched me in the face.

I brought this information back to the group and we quickly mobilized.  With only 2 months left before the event, we were lucky enough to find the Courtyard Marriott hotel.  They were willing to work into our budget and would supply the workforce to set up and take down.  Coffee in the morning and lunch rounded out the perks.  CHA-CHING!

Back on track, we finalized our speaker list and finished up our marketing campaign.  There were some minor issues the week before the event, but nothing that our crack team of SQL Server ninjas couldn’t handle.

If you didn’t make it, you missed an amazing day.  Here’s the good, the bad and the ugly:

The Good:

Amazing speakers, amazing volunteers and awesome content. Lunch went well with the hotel providing the manpower. Attendance was perfect, as any more would have been uncomfortable. Registration was a breeze, as most had used their SpeedPASS and pre-printed their raffle tickets. I got to meet a host of new friends (CANNONBALL!) and reconnect with some old ones. The after party showcased a real live mermaid (no joke!).

The Bad:

Parking was inadequate, even though we were assured by the hotel that there would be enough.  It got a little tight in the hallways in between sessions.  Our signage was lacking in the morning as attendees did not know that there were rooms in the back. 

The Ugly: 

Mike Fal (@mike_fal), Jon Amen (@dungeongurunet) and John Morehouse (@sqlrus)


Please leave a comment and let us know what you thought of the event. Good feedback will help us make next year even better! Stay in touch with the Sacramento SQL Server user group through http://sac.sqlpass.org and http://www.linkedin.com/groups?gid=2825448&trk=hb_side_g.

My first SQL Server presentation – database mirroring w/ Transparent Data Encryption

I’ve finally cut my teeth on a technical, public presentation on SQL Server.

This wasn’t explaining to a bunch of co-workers and managers about the latest features in SQL Server 2012. It wasn’t sharing my understanding of whatever piece of Scripture we were studying in church. It wasn’t addressing a group of parents about our team’s achievements over the last season. This was a group of my peers who potentially knew as much or more about the subject matter and, being in IT, were probably not afraid to let me know when I was wrong.

My first mistake: the topic for the presentation was way too large, database mirroring with Transparent Data Encryption. I chose the subject because I was familiar with the technical aspects and figured that I could spin it as ‘High Availability and passing an audit at the same time’. I had already done it in my environments, so I felt comfortable presenting…my relaxed demeanor was one area where I received compliments.

The day of the presentation I discovered that the computer I had brought from home had suffered a failure. I was using it, a cross-over cable and my laptop to simulate server-to-server communication (for the mirroring). I scrambled to get mirroring working between two instances on my laptop, to no avail (some of you saw my #SQLHELP tweet). The site where the presentation was being held did not offer WiFi, but I had a cellular dongle (borrowed from my wife) and could remote into my test servers at work. This was a last resort; it also meant a live demo.

I arrived early and tested my cell connection: 100% strong with full bars.  Sweet, one potential speed bump behind me.  We went through our normal meeting announcements and then it was my turn…I now had the Conch.  A quick prayer to the DemoGods and I was off and running.

I had way too many windows open.  My font size was too small.  I switched context between servers often. The code was not fully commented.  The last piece of the demonstration failed.

But…

I was relaxed throughout.  I was knowledgeable about the subject.  I kept consistent eye contact with my audience.  I engaged the audience regularly.

For what I had to overcome, it turned out fairly well.  The feedback I received was constructive and positive which is a testament to our Sacramento user group.  It gave me the confidence to want to do another.

I’m looking forward to my next opportunity…this time I’ll choose a topic that can be explained in 100 words or less.

Check out http://sac.sqlpass.org for my code and slide deck.

SQLSaturday #144 coming to Sacramento CA on July 28, 2012


The Sacramento SQL Server users group is happy to announce that we’ll be organizing our own SQL Saturday event to be held in Sacramento on July 28, 2012. Check out http://sqlsaturday.com/144/eventhome.aspx for more details!

Another “Wait on the Database Engine Recovery handle failed” – SQL Server 2000 to 2008R2 upgrade

I was given the task of upgrading our SQL Server 2000 instance to SQL Server 2008R2. I ran through the following steps to make sure the upgrade would be a success:

  • Took a snapshot of the VM
  • Upgraded to SQL Server 2000 service pack 4
  • Installed the updated Windows Installer 4.5
  • Launched the “System Configuration Checker” (found no issues)
  • Ran the “Upgrade Advisor” and addressed all issues

To this point, I figure I’m golden. Running the upgrade, it finishes with the always daunting, ever elusive error:

“Wait on the Database Engine Recovery handle failed. Check the SQL Server error log for potential causes.”

This occurred during the “SqlEngineDBStartConfigAction_upgrade_configc_Cpu32” portion of the upgrade. The error log showed:

Googling took me to a number of sites, but this one was the most useful: http://social.msdn.microsoft.com/Forums/en/sqlsetupandupgrade/thread/02e221e5-9a4b-444c-a136-ef3fdbec84e9

Not sure why the answer was marked the way it was, but I had to scroll to the bottom of the post to this reply:

“This issue has been resolved.  It still goes back to the complex password issue.  I had to turn off complex password policy on my domain, run the install repair, and it worked successfully.  It does not help to just make the SA password complex.  There are many blogs out there that point to an account being used in the install script that does not meet the complex password.  I could not find that account in my case, so I had to turn off my policy temporarily.”

I had changed the SA password a few times, made sure it was correct by logging in, and had actually used that account to perform the service pack 4 installation, so I was confident that it met the complexity policy. According to the reply above, ‘sa’ is not the problematic account anyway. Once I temporarily disabled the password complexity policy, the engine started up without issue.

Some tools that helped me:

To change the ‘sa’ password in SQL Server 2000 (run as a member of the ‘sysadmin’ role; no need for the old password):

EXEC sp_password @new='AG00dlongP@ssword!', @login='sa'
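As a side note, sp_password was deprecated after SQL Server 2000; once the upgrade to 2008R2 is complete, the equivalent is ALTER LOGIN:

```sql
-- SQL Server 2005 and later: use ALTER LOGIN instead of the deprecated sp_password
ALTER LOGIN [sa] WITH PASSWORD = 'AG00dlongP@ssword!';
```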

Running the SQL Server executable from the command line with the -m (single-user mode) option gave more information. Change directory to the location of sqlservr.exe and issue the following command. It will output messages to a file in that same directory called ‘error.txt’ (useful for debugging, as the DOS buffer is inadequate):

sqlservr.exe -m > error.txt

“In transit” encryption to SQL Server

To encrypt connections to the SQL Server engine, there are a few options:

  1. IPsec, which encrypts the whole network packet
  2. SSL (Secure Sockets Layer) with “Force Encryption”, which only encrypts the traffic sent to and from the database engine
  3. IPsec + SSL

“In transit” encryption is the encryption of the data packets at the network level. IPsec does this well, but is set through the operating system and encrypts ALL packets that come and go to the server. If the network stream needed to be analyzed to troubleshoot a problem, IPsec would have to be turned off. This might involve the network analyst, the server admin or a combination of the two. If the DBA does not wear one or both of those hats, it’s extra resources that may have to be tapped.

Using SSL with SQL Server is a relatively easy task. It has the advantages of only encrypting the traffic being sent to the SQL Server engine and can be controlled (for the most part) solely by the DBA.

I won’t go into the implementation of IPsec, but will offer this site.

Encryption using SSL offers a few choices: use a certificate from a trusted root authority, create your own certificate, or use the self-signed certificate that gets created when SQL Server starts.

A certificate from a trusted root authority needs to be purchased from a vendor such as GoDaddy (Google for more vendors).

You can create your own certificate using the IIS Resource Kit. Here’s a link that has a decent explanation; the Microsoft documentation will have more information.

For the above two options, the certificate will need to be imported on the Windows server hosting SQL Server and on any clients wanting to connect to the database engine.

To install a certificate (from SQL Server Books Online):

  1. On the Start menu, click Run, and in the Open box, type MMC and click OK.
  2. In the MMC console, on the File menu, click Add/Remove Snap-in.
  3. In the Add/Remove Snap-in dialog box, click Add.
  4. In the Add Standalone Snap-in dialog box, click Certificates, click Add.
  5. In the Certificates snap-in dialog box, click Computer account, and then click Finish.
  6. In the Add Standalone Snap-in dialog box, click Close.
  7. In the Add/Remove Snap-in dialog box, click OK.
  8. In the Certificates snap-in, expand Certificates, expand Personal, and then right-click Certificates, point to All Tasks, and then click Import.
  9. Complete the Certificate Import Wizard, to add a certificate to the computer, and close the MMC console. For more information about adding a certificate to a computer, see your Windows documentation.

To enable SSL in SQL Server using the imported certificate, open SQL Server Configuration Manager. In the left-hand pane, right-click ‘Protocols for MSSQLSERVER’ and choose ‘Properties’. (If you’re connecting to a named instance, it will be ‘Protocols for [named instance]’):

SQL Server Configuration Manager

Choose the ‘Certificates’ tab, then choose your imported certificate from the drop down menu.

Choose the ‘Flags’ tab and set ‘Force Encryption’ to ‘Yes’. SSL will now be implemented once the SQL Server service is restarted.

To enable SSL using the self-signed certificate that SQL Server creates on each service start, simply set ‘Force Encryption’ to ‘Yes’ and restart the service. No certificate needs to be installed (see above).

Notes:

The self-signed certificate offers the weakest security; encryption with it does not protect against “man in the middle” attacks.

“In transit” encryption could have a performance impact, as there is additional overhead in the encryption and decryption of the packet.

Clients not using the SQL Server connection libraries may not be able to communicate with the engine when encryption is turned on. Tests can be done using the self-signed certificate.

Regardless of whether “Force Encryption” has been enabled, SQL Server always encrypts the authentication process when a SQL Server login connects to the engine.
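To confirm whether a given connection actually ended up encrypted, the sys.dm_exec_connections DMV can be queried (available in SQL Server 2005 and later):

```sql
-- encrypt_option reports 'TRUE' when the connection is SSL-encrypted
SELECT session_id, encrypt_option, auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```

Run it from the client you are testing; checking @@SPID shows the state of your own connection.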

Simple restore using T-SQL

Restoring your database is simple using T-SQL.

I begin by using ‘RESTORE FILELISTONLY’ to get information about the files contained in the backup set. It returns a result set with information like ‘LogicalName’, ‘PhysicalName’, ‘FileGroupName’ and ‘Size’ (in 8K pages), among other things. You’ll use this information if you’re renaming or moving the .MDF or .LDF files contained in the backup set:

RESTORE FILELISTONLY
FROM DISK = 'D:\MSSQL10_50.SANDBOX_2K8R2\MSSQL\BACKUP\TestDB\TestDB.bak'

The output will be something like this:
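An illustrative result set is sketched below; the logical names, paths and sizes shown here are made up to match the example and will differ in your environment:

```
LogicalName   PhysicalName                                            Type  FileGroupName  Size
-----------   -----------------------------------------------------   ----  -------------  ---------
TestDB        D:\MSSQL10_50.SANDBOX_2K8R2\MSSQL\DATA\TestDB.mdf       D     PRIMARY        3145728
TestDB_log    D:\MSSQL10_50.SANDBOX_2K8R2\MSSQL\DATA\TestDB_log.ldf   L     NULL           1048576
```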

To restore a backup over the existing database, use:

RESTORE DATABASE [TestDB]
-- location and name of the backup set file
FROM DISK = 'D:\MSSQL10_50.SANDBOX_2K8R2\MSSQL\BACKUP\TestDB\TestDB.bak' 
WITH RECOVERY,
REPLACE

The REPLACE option instructs the engine to overwrite the existing data and log files. Use it cautiously. If REPLACE is not specified and the database has changes later than what is stored in the backup set, you’ll receive an error:

“Msg 3159, Level 16, State 1, Line 1
The tail of the log for the database “TestDB” has not been backed up. Use BACKUP LOG WITH NORECOVERY to backup the log if it contains work you do not want to lose. Use the WITH REPLACE or WITH STOPAT clause of the RESTORE statement to just overwrite the contents of the log.”
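If that tail-of-log work matters, back it up before restoring rather than discarding it with REPLACE. A minimal sketch (the backup path is just an example):

```sql
-- Back up the tail of the log; NORECOVERY leaves the database in a restoring state
BACKUP LOG [TestDB]
TO DISK = 'D:\MSSQL10_50.SANDBOX_2K8R2\MSSQL\BACKUP\TestDB\TestDB_tail.trn'
WITH NORECOVERY
```

The tail-log backup can then be restored after the full backup to bring the database fully up to date.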

If the data and log directory are different from the backup set, use:

RESTORE DATABASE [TestDB]
-- location and name of the backup set file
FROM DISK = 'D:\MSSQL.1\MSSQL\BACKUP\TestDB\TestDB.bak' 
WITH RECOVERY,
-- location and name of the data file
MOVE 'TestDB' TO 'D:\MSSQL.1\MSSQL\TestDB.mdf', 
-- location and name of the log file
MOVE 'TestDB_log' TO 'D:\MSSQL.1\MSSQL\TestDB_log.ldf'

For the above example, I’ve copied a .bak file from my named instance server SANDBOX_2K8R2 to a default instance server. The location of the SQL Server data directory is different on the two servers, so we need to use the MOVE parameter with the RESTORE command.

Be careful with the naming of the data and log files. Although the extensions have no meaning to the SQL Server engine, the convention is to use .mdf for the primary data file (.ndf for any secondary data files) and .ldf for the log files. Any other convention could get confusing.
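After the restore, the physical file locations can be confirmed from the catalog (sys.master_files exists in SQL Server 2005 and later):

```sql
-- Confirm where the restored database's files actually landed
SELECT name AS logical_name, physical_name, type_desc
FROM sys.master_files
WHERE database_id = DB_ID('TestDB');
```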

PASS Summit 2011 first timer, three days of awesomeness

My first time attending the Professional Association of SQL Server Summit was amazing! Three days of great people, useful information and free stuff! I had to check an extra bag at the airport to bring home my loot.

I arrived at the conference on Tuesday night, too late to attend the ‘first timer’ event. I registered, then headed towards the dimly lit exhibit hall, where a large crowd of participants were gathered. The complexity of the convention center, coupled with the overwhelming feeling that I was going to miss something had me reeling, but I quickly recovered when I ran into a fellow Sacramento SQL Server User Group board member. He was a second year veteran and seeing my anxiety, began leading me through the crowd of fellow SQL Server-ites, showing me where to get food, grab a drink and talk to the pros. The big screens in the exhibit hall showed the magnified images of the “Jeopardy” contestants, SQL Server MVPs who were playing a mock game show using SQL Server, popular culture and everything in between as the questions to the given answers. We eventually found other members of our group; at this point my anxiety had been replaced with sheer excitement.

The night was young and there were some after-hours events to attend. Our lil band dispersed, each heading to pursue his own agenda but with promises of regrouping the next day. My itinerary led me to the SQLServerCentral.com casino night, hosted by the popular website and RedGate software. As I contribute to and frequent that site regularly, I was looking forward to putting faces to the various names that I see every day. My first encounter was with “the man himself”, Steve Jones. We shook hands, exchanged some pleasantries and I sauntered in to join the festivities. Jeff Moden was accepting his “Exceptional DBA” award (truly earned) and after some “gambling” and the raffle, Steve and I were able to talk about life, professions and the Seattle underground (he told me about the underground tour). Getting to speak with someone I admire really made my night; the bar was set high for the rest of the week.

Wednesday, October 12th: Day 1

10:15 – Thomas Grohser presented on SQL Server I/O basics. I took pages of notes, reinforcing some of the things I already knew and absorbing the points that I was fuzzy on. Forgot to fill out an eval =(

13:30 – Victor Isakov filled us with information on indexing. It was meant to span two sessions, but after the first hour and a half, I opted to find another topic.

15:00 – Andy Warren gave tips on how to build a Professional Development Plan. This one really hit home as I’m always looking for the “next adventure”; here I was given practical advice on how to game plan to get to my next level.

16:45 – Aaron Bertrand ended the day with an outline of new features in SQL Server 2012 (“Denali”).

Thursday, October 13th: Day 2

10:15 – Klaus Aschenbrenner gave a very practical session on troubleshooting performance problems with SQL Server 2008. He outlined an easy to follow recipe which I really appreciated learning.

Lunch – Women in Technology – The WIT group fed us and treated us to a discussion on various topics. Awesome time and great food.

13:30 – Victor Isakov again, presenting on Trace Flags that every SQL Server DBA should know. The information was useful, but I wish I would have gone to the Thomas LaRock presentation. I’ll make sure to watch that one on the DVD.

15:00 – Tim Ford showed his “Periodic Table of Dynamic Management Objects”, truly entertaining for a science nerd like myself. I’m hoping I can find the poster online.

17:00 – Maciej Pilecki gave an in depth presentation on execution plans. Very, very useful information.

Friday, October 14th: Day 3

10:15 – Brad McGehee taught us the inner workings of the Transaction Log, giving us tons of useful best practice and tips to managing one of the most important aspects of SQL Server.

13:00 – Il-Sung Lee outlined some security changes in SQL Server 2012, as well as reviewing the changes that SQL Server 2008 gave us. My organization is heavy into security so I was glad to attend.

14:30 – Brian Mitchell and Dandy Weyn presented on SQL Server Parallel Data Warehouse. I can now go back and argue with my Oracle friend on the virtues of PDW vs Exadata.

16:15 – Thomas LaRock showed SQL Server memory management. I unfortunately left early to catch my plane.


All in all, an amazing experience. I was able to meet some of the big names in the SQL Server community (Andy Warren, Jorge Segarra, Steve Jones, Brian Davis) and learn tons about my chosen profession. Everyone was super friendly, sharing handshakes and experiences as well as a business card now and then. We ate, drank and built stronger relationships. I’m truly looking forward to next year’s event.