SQL Server 2016 – Database mail queued but not sending

Have you installed SQL Server 2016, enabled Database Mail and not been able to send a test email?  Does the email queue show a list of queued mail items but no entries in the Database Mail log?  This may help.

I installed SQL Server 2016 with CU1, used a domain user as the service account for the SQL Server services and gave that domain user local administrator privileges. I turned on Database Mail and created a profile and email account to send mail. When I attempted to send a test email, it never arrived. Hmm, time to start troubleshooting.

First, I checked the Database Mail log. This was accomplished by going to the Object Explorer window, expanding ‘Management’ and right clicking on ‘Database Mail’ and choosing ‘View Database Mail Log’:


How to view SQL Server Database Mail log

No entries. Hmmm.

Next was to see what MSDB thought about the process. Querying MSDB using the built-in mail stored procedures and system tables yielded little insight:

-- This one told me that Database Mail was started
EXEC msdb.dbo.sysmail_help_status_sp;
-- Here I learned that there were 5 items queued and the last times I tried sending mail
EXEC msdb.dbo.sysmail_help_queue_sp @queue_type = 'mail';
-- This confirmed none of the email was sent
SELECT * FROM msdb.dbo.sysmail_sentitems;
SELECT * FROM msdb.dbo.sysmail_unsentitems;
-- Is Service Broker enabled? It has to be to send mail
SELECT is_broker_enabled FROM sys.databases WHERE name = 'msdb';
-- I tried stopping and restarting the Database Mail exe
EXEC msdb.dbo.sysmail_stop_sp;
EXEC msdb.dbo.sysmail_start_sp;
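In my case Service Broker was already enabled, but if that is_broker_enabled query returns 0, it can be turned back on. A sketch (this needs exclusive access to MSDB, so ROLLBACK IMMEDIATE will kick out any other sessions; run it in a maintenance window):

```sql
-- Re-enable Service Broker on msdb if is_broker_enabled = 0
-- WITH ROLLBACK IMMEDIATE rolls back any sessions holding msdb open
ALTER DATABASE msdb SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;

-- Then restart Database Mail
EXEC msdb.dbo.sysmail_start_sp;
```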

The domain account responsible for running the SQL Server service and Agent was a local administrator. I made it an instance administrator and restarted the instance. Still no success. And since both I and the service account were in the ‘sysadmin’ server role, permissions and membership in the MSDB database role ‘DatabaseMailUserRole’ were not the issue.

My last attempt was to check the Database Mail executable directly. A Windows search for ‘databasemail’ showed it to be in “C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Binn”. Double clicking (or right clicking and choosing ‘Run As Administrator’) the executable showed me that .Net 3.5 was not installed:


.Net Framework 3.5 is not installed


This was a Windows Server 2012 R2 box, so I had to add the .Net 3.5 framework, which meant I had to dig up the Windows install media (there may be ways to do it without, but I didn’t dig that deep in Bingle).

Opening Control Panel and choosing ‘Turn Windows features on or off’ brought up the wizard.

Choose ‘Role-based or feature-based installation’ > ‘Next’.  Select the server and click ‘Next’.  Leave all boxes blank on the ‘Select server roles’ screen and choose ‘Next’. On the ‘Select features’ screen, check the box for ‘.Net Framework 3.5 features’ and click ‘Next’.

The ‘Confirm installation selections’ screen will have a warning at the top beginning with ‘Do you need to specify an alternate source path?…’.  Below the big white box in the middle of the window, click the link ‘Specify an alternate source path’; this brings up a new window.  Enter the path to the Windows media.  Since I’d mounted the disc to E:\, my source path was ‘E:\sources\sxs’.


Supply alternate source path

Click ‘Install’ (the steps above came from https://technet.microsoft.com/en-us/library/dn482071.aspx, which notes that it is no longer maintained).
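The same feature can also be added from PowerShell instead of clicking through the wizard; a sketch, assuming the install media is mounted at E:\ as above:

```powershell
# Add the .Net Framework 3.5 feature, pointing at the mounted install media
Install-WindowsFeature -Name NET-Framework-Core -Source E:\sources\sxs
```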

Once .Net 3.5 was installed, I went back and double clicked the ‘databasemail.exe’ from above.  All of my queued mail was delivered, and test emails were now sent successfully.

These help with troubleshooting:

This one has a comment at the end that helped point me in the right direction and says that CU2 may resolve the above issue:

A quick look at MongoDB – part 1

NoSQL databases are all the rage lately, with some touting them as the demise of the traditional RDBMS.  I don’t share that view, but thought it worthwhile to begin exploring the new technologies.  A coworker of mine had gone through a seven week online course for MongoDB through MongoDB University (https://university.mongodb.com/) and was very happy with the outcome, so I thought I’d give it a shot, also.

The course was free and took 2 – 4 hours a week to complete.  It consisted of a video “lecture” with ungraded quizzes at the end to help reinforce the material.  Each week had a few homework assignments to turn in.  Some of the material was dated but they made a strong effort to point out those areas.  I was able to complete the course with a minimal amount of effort, but felt very good about the knowledge I’d gained.

So why MongoDB?  It was created by a group of developers, so one of the first attractions to the technology is that you interact with the data through JavaScript.  A developer doesn’t have to worry about learning SQL; they can stick with their “native” language.  Additionally, all data is stored in JSON (JavaScript Object Notation) documents, with which most programmers are already familiar.  The underlying data structure of the “database” more closely resembles OO programming constructs (a good article here).  Another attraction is that interaction with the data is “schema-less”; there is no set structure that must be adhered to when storing or retrieving the JSON documents.  But one of the main advantages of MongoDB is that it scales well horizontally, without the need to purchase high end hardware.  This is ideal for dealing with “Big Data” (yes, I used the buzzword).
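As a toy illustration of the schema-less idea (plain JavaScript, not actual MongoDB shell syntax): two documents in the same collection can have completely different shapes, and a query-by-example filter still matches against whichever fields are present:

```javascript
// Two documents in the same "collection" with completely different shapes
const docs = [
  { _id: 1, name: "Alice", tags: ["dba", "dev"] },
  { _id: 2, name: "Bob", address: { city: "Sacramento" } }
];

// Rough sketch of how a find({ name: "Bob" }) style filter behaves:
// a document matches when every field in the filter equals the same
// field in the document
function matches(doc, filter) {
  return Object.keys(filter).every(key => doc[key] === filter[key]);
}

const result = docs.filter(d => matches(d, { name: "Bob" }));
```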

How does MongoDB scale on commodity hardware?  It uses a concept called “sharding”.  Think “partitioning”.  Data is separated physically through the use of a “shard key” and access is managed through a central process.  Each server contains a subset of the data, “partitioned” across each node.  Node one would contain IDs 1 through 1,000,000.  Node two would have 1,000,001 through 2,000,000.  Node three 2,000,001 through 3,000,000, etc:

MongoDB shard
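The ID ranges above can be sketched as a toy router (plain JavaScript, not MongoDB’s actual mongos routing logic), where each shard owns a contiguous range of one million IDs:

```javascript
// Each shard owns one million IDs: shard 0 has 1..1,000,000,
// shard 1 has 1,000,001..2,000,000, and so on
const RANGE_SIZE = 1000000;

// Map an ID to the shard that owns its range
function shardFor(id) {
  return Math.floor((id - 1) / RANGE_SIZE);
}
```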
In addition, MongoDB uses the concept of “replica sets” (think “replication”).  A replica set contains multiple servers, each holding a copy of the data; this allows for read-only secondaries to help distribute load.  It also maintains high availability:

MongoDB replica set

A blending of the two configurations creates a highly available, high performing environment:

Replica set and sharding


Servers can be added or removed without interruption of service.  It’s an inexpensive way to scale horizontally.  It also allows for maintenance and high availability.  I can do a “rolling” patch or upgrade to servers while still allowing access to the data.  I’m also not completely down if a server goes offline.

This was part 1 of a series of blog posts, focusing on a brief overview and the architectural advantages of MongoDB.  Stay tuned for more posts.


SQL Saturday #540 in Sacramento CA – silky smooth…

Another year in the books, another amazingly successful event!  SQL Saturday #540 in Sacramento was our 5th endeavor and was arguably our best effort.  No law enforcement involvement, no fires or natural disasters and no one leaving dissatisfied.

This year boasted an international speaker lineup; it was great to meet Cathrine Wilhelmsen (B|T) from Norway.  We had a good showing of first-time speakers in Sacramento, thanks to Simon Cho (B), Anthony Nocentino (B|T), Reeves Smith (B|T), Bill Fellows (B|T), Elizabeth Bock (B|T), Meagan Longoria (B|T), Sayed Saeed (B|T) and Tim Plas (B|T).  Sacramento was well represented; shout out to Chris Anderson (B|T) for being a first-time SQL Saturday speaker and contributor to the event.  Dilip Nayak (T) and I did a tag team session on NoSQL and MongoDB that was well received, although buzzword bingo did not seem to go over as well this time.  Pro tip: make sure not to get a session right after lunch; it will limit the number of nodding heads in the audience.

Over 150 attendees were recorded, making this our second highest attendance level.  While the number of sponsors was lower this year, the support they gave enabled the event to be a great success.  Thanks to Tintri, DELL software, Microsoft, DesignMind, Plus Consulting, Idera, PASS, Devart, Robert Half and Sutter Health.  With the temperature outside going into the triple digits (I think the high that day was 103 F), the feeling inside was a constant “78 and sunny”.  Speaking with the team, we all agreed that this may have been our smoothest run event to date.

Thanks to all the speakers, sponsors and attendees.  Please visit the Sacramento SQL Server user group LinkedIn page to give us any feedback.


See you next year!


SQL Saturday #312 – Sacramento, CA

SQL Saturday #312 will be held July 12, 2014 in Sacramento CA, our third annual event.  I’ve chosen to join my Sacramento SQL Server user group board members and participate in a “lightning talk”, outlining how to grant permissions to users through the signing of stored procedures.  It’s a great topic and can be put into use very easily and quickly, with a good bang for your buck.
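The general shape of the technique looks something like this (all object names and the password here are illustrative sketches, not from the actual talk):

```sql
-- Create a certificate in the database that owns the procedure
CREATE CERTIFICATE ProcSigningCert
    ENCRYPTION BY PASSWORD = 'Str0ng!Passw0rd'
    WITH SUBJECT = 'Stored procedure signing';

-- Create a user mapped to the certificate (no login required)
CREATE USER ProcSigningUser FROM CERTIFICATE ProcSigningCert;

-- Grant the elevated permission to the certificate user, not to callers
GRANT SELECT ON dbo.SensitiveTable TO ProcSigningUser;

-- Sign the procedure; callers now get the permission only while
-- executing this procedure
ADD SIGNATURE TO dbo.usp_GetSensitiveData
    BY CERTIFICATE ProcSigningCert
    WITH PASSWORD = 'Str0ng!Passw0rd';
```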

Mitch Bottel (B|T), Dilip Nayak (T) and Joe Simmons (T) will be joining me.  Check out the abstract at http://sqlsaturday.com/viewsession.aspx?sat=312&sessionid=22630.

Hope to see you there!

T-SQL Tuesday #049: What’re we waitin’ fer?!?!?


What’re we waitin’ fer?!?!?

Had to channel my inner Mickey for the title of the post…Rocky II is one of the first movies that I remember seeing as a re-run on cable…but I digress.

When I read Robert’s post and subject for this month’s T-SQL Tuesday, my mind immediately went to one of my first experiences with wait stats, as it was a very poignant “ah-HAH!” moment for me.  I was an administrator for several SQL Server instances and had been doing it for a while, but knew very little about SQL Server’s wait stats.  A few nights before my incident I was reading through Jonathan Kehayias’ (B|T) “Troubleshooting SQL Server: A Guide for the Accidental DBA” which has a section devoted to finding and diagnosing issues around waits.  I had saved the supplied queries and was developing a method for saving wait information and analyzing it over time, to help with my administration.

So there I was at work, taking care of whatever was on the radar that day.  Cue my users: emails and phone calls began coming in, telling me that our campus imaging database was unresponsive and that the application was down.  Our monitoring software wasn’t reporting any unusual resource strain on the instance and the other databases weren’t having any problems.  Enter my shiny new query: I ran it and found a wait on BACKUPTHREAD.  It turned out that my hourly backup of the imaging database had hung; using the very same query, I found that I wasn’t having any problems with I/O.  I was unable to kill the backup process, and a phone call to Microsoft had me restart the instance.
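The query I’d saved was along these lines (a simplified sketch of the kind of wait-stats check in Jonathan’s guide, not his exact script):

```sql
-- Top waits since the instance started (or since the stats were cleared)
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    wait_time_ms - signal_wait_time_ms AS resource_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'SQLTRACE_BUFFER_FLUSH')  -- trim benign waits
ORDER BY wait_time_ms DESC;
```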

It was awesome to be able to use something that was pretty fresh in my mind and be able to get to the root of an issue in just a few minutes.  I remember thinking “heck, that was pretty badass.”

Thanks to Robert Davis (B|T) for this month’s topic.

Here’s some more information about T-SQL Tuesdays (#tsql2sday on Twitter)


T-SQL Tuesday #047 – The stuff we all get!

My story isn’t so much about quality as it is about quantity.

PASS Summit 2011 was my first time attending that most awesome of events and I was a bright-eyed noob.  I had no idea what to expect and was already a bit overwhelmed with travel accommodations, light rail rides and hotel check-in.  Walking into the Tuesday night welcome reception did nothing to ease the nerves.


At the reception was my first “score”, a blue umbrella.  After the reception was casino night with SQLServerCentral.com…this yielded an iPod shuffle and a t-shirt.  Two hours into the trip and I was already playing with house money.

Over the next three days, I amassed a pile of t-shirts, 2 hats, some rubber duckies, a new backpack, a flashlight, bouncy balls that lit up and more pens and note pads than Kinko’s.  I nearly had to check an extra bag to haul all my booty home.  My wife loved it ‘cuz I’m a “t-shirt and jeans” kind of guy…she wouldn’t have to buy me clothes for another year.


Don’t be a tool…

My favorite shirt was bright orange and in misaligned letters on the front read “Don’t be a tool…”.  It’s still my go-to camisa for Saturdays around the house.

I’ll be attending this year’s Summit and am looking forward to more SWAG!

Thanks to Kendall Van Dyke (B|T) for this month’s topic.

Here’s some more information about T-SQL Tuesdays (#tsql2sday on Twitter):


SQL Saturday #190 – B-I-N-G-O spells “murder”

When an event like SQL Saturday #190 in Denver is announced, people listen.  Speakers such as Steve Jones (B|T), Wendy Pastrick (B|T), Tim Ford (B|T) and Tim Mitchell (B|T) draw crowds.  When I found out that I was chosen to speak, I was ecstatic.

SQL Saturday #190 - Denver CO

SQL Saturday #190 – Denver CO

My topic was boring ol’ SQL Server encryption but I decided to try and spice it up with a variation of “buzzword” Bingo.  I created a 3×3 playing card with phrases and pieces of code from the presentation.  Attendees had to get all 9 squares to win a Starbucks gift card.  The idea was to help reinforce the topic and subject matter.  My plan was working until 6 players raised their hands (BINGO!) but I only had 5 cards to give away.  Next time, I’ll bring a bag of candy.

The feedback I received from attendees was overwhelmingly positive, with some very good constructive criticism.  Everyone loved the BINGO…most wished we had more time to dive deeper into the subject.  I’m thinking this may be the perfect opportunity for a pre-con…

My favorite presentation of the day was Doug Lane’s (B|T) “SQL Server Murder Mystery Hour: Dead Reports Don’t Talk”.  Attendees were grouped up in teams of 5 and were tasked with deciphering the mystery of some missing reports.  Character actors Jason Horner (B|T) (“Lon E. Wolf”, the developer), Tim Ford (“Grant Essay”, the app vendor), Wendy Pastrick (“Pat Locke”, the DBA) and Bill Fellows (B|T) (“Bill Freeley”, the consultant) were interrogated by Doug to find the culprit.  Mock emails were passed around giving more clues to the actions of the various players.  In the end, it was determined that Lon acted nefariously….Lon chose to run screaming down the halls rather than face the consequences of his actions.

Hats off to the Denver SQL Server users group (http://denver.sqlpass.org)  for a well run, well organized event.  And thanks to them for choosing me to present.  I’m looking forward to next year.

Find SQL Server services running on a remote server using Powershell

You’re given a list of Windows servers and told they host multiple SQL Server instances.  How do you find which services are running?  Use Powershell:


Get-Service mssql* -ComputerName myServer


It will give a nice list of services that have MSSQL at the beginning of their names:

Status   Name              DisplayName
------   ----              -----------
Stopped  MSSQLServerADHe…  SQL Server Active Directory Helper
Stopped  MSSQLServerADHe…  SQL Active Directory Helper Service

A shorter version:

gsv mssql* -cn myServer
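Given that list of servers, the same check can be swept across all of them (and widened to catch the Agent services); a sketch, assuming the server names sit in a servers.txt, one per line:

```powershell
# Check the SQL engine and Agent services on every server in the list
Get-Content .\servers.txt | ForEach-Object {
    Get-Service -Name 'MSSQL*', 'SQLAgent*' -ComputerName $_ |
        Select-Object MachineName, Status, Name, DisplayName
}
```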

SQL Saturday #190 – Dude, where’s my keys???

This weekend marks my first time attending a SQL Saturday outside of my home state of California.  It also marks my second presentation opportunity at a SQL Saturday.  Come see “SQL Server encryption 3 ways – in transit, at rest and in the database” in Denver, CO at SQL Saturday #190.

I’ve given this talk a few times already but am going to put a new spin on this one: buzzword bingo.

You may remember the IBM commercial where the office crew is lamenting their forced attendance at a company wide vendor presentation…the meeting becomes slightly more appealing when “bingo” cards are passed out.  You earn “spots” when the presenter throws out a marketing IT buzzword (“value adding”, “synergistic”, “organic growth”).  The commercial ends with one of the crew shouting “bingo” in the middle of the sales pitch.  During my presentation, I’ll be spitting out phrases, T-SQL commands and buzzwords to explain the topic.  Each attendee (if they so choose) can grab a bingo card and play along.  The first to line up (or maybe “black out”) a card will win a prize…A NEW CAR!  (just kidding, sorry Bob Barker).  My hope is that the bingo will help reinforce a potentially very complicated topic.

You can register for the event here: SQL Saturday #190

There will also be some movie trivia, so make sure to come learn some new things and have fun doing it.

T-SQL Tuesday #046 – Rube Goldberg in order to SQL Server

My first T-SQL Tuesday post and am I excited!  My story is going to vary slightly from the original subject in that I’m going to outline a Goldberg machine that was created in order to CONNECT to SQL Server, showing how much my former employer valued the Microsoft database.  Thanks to Rick Krueger (@DataOgre|B) for a great way to tell some fun IT stories.


Flash back to the ’80s where a young man creates a general ledger accounting program for the California state university he’s working for.  The app was created using the MPE programming language, connecting to an IMAGE/SQL database on a then-cutting edge HP3000 server.  The app is so successful that the young man starts a company and begins selling his product.  He recruits students from the college and works in his garage.


As the industry evolved and sales grew, the application needed to scale and outgrow its aging hardware: enter the world of UNIX.  Cobol is chosen as the new, state-of-the-art language and Oracle or Informix is now the database.  Customers could choose any flavor of UNIX they liked, as long as the Microfocus Cobol compiler was compatible and Oracle/Informix would install.  But, what to do with the old MPE -> IMAGE/SQL code?

Instead of re-writing, the company chose to create a “transport” layer of code that would allow MPE  to co-exist with Cobol, making calls through that layer to the “shiny new database”.  All new development was done in Cobol, but the MPE code was so fundamental that it still played a HUGE role in the functionality of the application.


The industry continued to evolve: enter the brave new world of the internet.  Instead of clunky telnet clients that had to be installed on each desktop, the company decided it needed to create a web application, one where clients only needed an internet browser to access an internally hosted IIS server.  But IIS is a Windows product and although all new development was being done in .NET, the MPE/Cobol code still was the backbone of the product.  For most modules, the .NET code was making calls to the MPE/Cobol code that then interacted with the Oracle/Informix database living on the UNIX server.

Which one of these do you picture as the bath tub filling with water to tip a scale to release a ball bearing that slides down a ramp to…


In order to remain current in the industry and reduce costs for clients to purchase necessary hardware to host the app, SQL Server was offered as an alternative database.  But legacy code is legacy code and MPE “transported to” Cobol does not play nicely in Windows.  The solution: emulate UNIX on Windows!  Using MKS (a UNIX emulator app for Windows), the MPE/Cobol code could now be used to “interact” with the SQL Server database.  The interaction was done using SQL Server logins (no sense in leveraging Active Directory at this point) and reporting was done using the proprietary solution offered with the software (SSRS???).

Interestingly enough, the company was successful to the point that it attracted the attention of a corporate giant and was sold for a handsome sum.

In summation, “you can’t tell a book if the title’s covered” (http://www.imdb.com/character/ch0029466/quotes).

Hopefully you never had to support this type of machine.  But, I do look back on those help desk days with fondness, as I had no clue what I was doing but was surrounded by great people.