While at TechEd 2011 I spent a good amount of time talking with Adrian Bethune (@canon_sense), the new product manager for SQL Server Manageability. He was originally hired onto the team by the magnificent Dan Jones (blog | twitter), who was smart enough to run screaming from the team towards an awesome new position at Microsoft. Adrian was nice (or crazy) enough to take some time and sit down with me for an interview so that he could be introduced to the community at large (and so that everyone knows where to throw their rotten fruit and vegetables).
[Denny] As someone I’ve met a couple of times now, I know a little about your history at Microsoft, but the good folks of the SQL community don’t know much about you, as you’ve done a pretty good job of staying out of the public eye, until now. Can you tell us a little about your life at Microsoft and how you got here?
[Adrian] I finished my CS degree at the University of Illinois [UIUC] in 2007 and came to work in the build and test infrastructure team here in SQL Server for a few years to get some experience building enterprise-scale services and applications that get deployed and used right away. In the infrastructure team I worked on the test and build automation systems that pump millions of tests through hundreds of builds every day. The coolest projects I worked on included designing, deploying, and migrating to a next-gen distributed build system, as well as automated storage management and provisioning services that work on top of high-end hardware. As Microsoft’s SQL Server strategy shifted to include the cloud and to focus on reducing the cost of developing and maintaining SQL Server, I saw a great opportunity in the SQL Manageability team to get into the thick of it, so I joined the team last June.
[Denny] DACPAC has a pretty sordid history, with the v1 release being looked upon less than favorably (it may have been compared to a steaming pile of something, or Windows ME, etc.). What brought you to the team and made you want to take this project on?
[Adrian] While the first implementation did leave something to be desired, as you subtly point out, four important points drew me to this area.

First, the entire DAC concept is new, so it’s more of a startup environment than the typical monolithic product development team where you get pigeon-holed very quickly. Right out of the gate we were heads-down on shipping a major feature as soon as possible – in-place upgrades – in VS 2010 SP1 and SQL 2008 R2 SP1.

Second, the concept of DAC and the services it provides is appealing even if the first implementation is not ideal. The way I see it, DB developers have become accustomed to developing on this cumbersome stateful beast by continuously executing differential scripts which modify the state of their application (schema). With the push towards minimizing the cost of managing databases, developers and DBAs need serious tools that reduce the burden of managing and deploying databases so they can focus on real innovation. DAC has the potential to become one of the key pillars in the drive to drop those costs.

Third, the engineering team behind DAC is staffed with some top development and test talent. The DAC team is a serious engineering team with a passion for demonstrable quality and the drive to push multiple agile releases, so it’s a fun and exhilarating team to work with. Over the next few months you’ll see some exciting announcements and developments in the DAC space, both with and for a multitude of partners and products within Microsoft, as well as integration into the tooling and services for SQL Azure.

Lastly, the partnerships and engagements within SQL have been fantastic. DAC is not just a SQL Manageability initiative; it’s a SQL initiative, with some great work from the Engine team on providing a set of compilation services to validate the DAC model as well as moving the needle towards containing the database. Together with the Engine team we will provide a symmetrical model for a database (application) in both the runtime environment (contained databases) and the logical definition (DAC – the dacpac). Check out the DAC/CDB session from TechEd for more info on the roadmap – http://channel9.msdn.com/Events/TechEd/NorthAmerica/2011/DBI306. In the session you’ll see how the Engine and DAC teams are working towards a common vision to make developing and managing databases cheaper.
[Denny] From what you’re saying it sounds like some Microsoft products will begin using DAC and DACPAC to deploy applications. Does this include customer-shipped applications such as Dynamics and SCOM, or just internal applications?
[Adrian] Besides several internal teams picking up and using DAC services for their own purposes, shipping products will also be integrating with it. Publicly, System Center Virtual Machine Manager 2012 has already shipped a beta with DAC integration. At TechEd, the AppFabric team announced their new composite application model initiative/project, which also integrates DAC as the data-tier component for applications. Expect to see more products integrate DACFx in the coming months. That’s all I can say for now.
[Denny] If people have feedback on Data Tier Applications (DAC) or DACPAC what’s the best way to get that to the team?
[Adrian] The broader community can always engage us with Connect bugs or on the MSDN forums, but for MVPs and folks I interact with, feel free to shoot me a personal mail.
[Denny] Knowing the abuse that we’ve given our good friend Dan Jones (a.k.a. DACPAC Dan), did that make you hesitant to take on DAC and DACPAC?
[Adrian] Sure, it would give any reasonable person pause. However, my own estimation of the potential value of DAC, and the chance that we could align with our partner teams in the Engine, Juneau, and Visual Studio to provide a single surface area for development that enables some key management features, trumped my reservations. While I can’t disclose much more than we talked about at TechEd, I can say that the reality has met the potential, and it’s exciting to see how the future is shaping up.
[Denny] So when you aren’t being abused by the MVPs, and you are permitted to actually leave the confines of Building 35, how do you fill the 10 minutes of daily free time that Steve Ballmer allocates to you?
[Adrian] From time to time they do let us out, but only enough so people don’t file missing persons reports. In my spare time I hang out with the wife, dabble with gadgetry, swim, read quite a bit (sci-fi, typically), and follow economic and political news and trends.
[Denny] Are there any other projects that you are working on that you can share with us that’ll be included in the SQL Server “Denali” release or maybe even earlier?
[Adrian] After ramping up, I spent the latter half of last year working on shipping DAC v1.1, which includes in-place upgrades, as soon as possible – which is why we actually shipped in Visual Studio 2010 SP1 and will ship in SQL Server 2008 R2 SP1 (CTP available today). Once we shipped 1.1, I worked on getting the import/export services up and running, and we shipped a CTP, currently available on www.sqlazurelabs.com, which you may have seen at TechEd. In parallel, I am working on an import/export service for SQL Azure, which will provide import/export as a service that reads from and writes to Azure BLOB storage, without the need for client-side tools. Apart from that, I’ve been very busy working on partnership engagements within Microsoft, because DAC provides a nice, cheap way for other product teams to operate on and with SQL Server and SQL Azure.
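As an aside for readers who want to poke at the client-side export piece Adrian describes: here’s a minimal sketch of driving a BACPAC export from Python. This is an illustration, not the CTP tooling from the interview; it assumes the SqlPackage command-line utility (the tool these DAC import/export services later shipped in) is installed and on the PATH, and the server name, credentials, and file name are all placeholders.

```python
# Minimal sketch: export a SQL Azure database to a local BACPAC file
# by shelling out to SqlPackage. Assumes SqlPackage is on the PATH;
# the connection details below are placeholders, not real values.
import subprocess

subprocess.run(
    [
        "SqlPackage",
        "/Action:Export",
        # Implicit string concatenation builds one connection-string argument.
        "/SourceConnectionString:Server=tcp:myserver.database.windows.net;"
        "Database=mydb;User ID=sqladmin;Password=...;Encrypt=True;",
        "/TargetFile:mydb.bacpac",
    ],
    check=True,  # raise CalledProcessError if the export fails
)
```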
[Denny] I’m interested in this Azure import/export utility. The BLOB storage that this will integrate with (keeping in mind that I don’t know much about Azure besides the SQL Azure part) – how would one get a file uploaded to it automagically? Can you FTP files to it, or is there an API which has to be used, etc.?
[Adrian] There is an API you can use; however, there are quite a few tools which will synchronize folders between your client machine and your BLOB storage account. That’s the easiest way to get your files into the cloud. I won’t mention any specific tools, to avoid favoritism/politics, but a quick search for “azure storage tools” is a good starting point. Keep in mind that the only time you need to transfer the import/export artifact – a BACPAC – between your client and the cloud is when you want to migrate or move your database between the cloud and on-prem environments. Otherwise, you can just keep your files in the cloud in your BLOB account and use our services to operate over them. Sounds like a good topic to cover in a session…
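For the curious, here’s a minimal sketch of that “get your files into the cloud” step done programmatically rather than with a sync tool. It assumes the present-day azure-storage-blob Python package (which postdates this interview), and the connection string, container, and file names are all placeholders.

```python
# Minimal sketch: upload a BACPAC into an Azure BLOB storage container
# so the cloud-side import/export service can operate on it.
# Requires: pip install azure-storage-blob
from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the storage account.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=myaccount;"
    "AccountKey=...;EndpointSuffix=core.windows.net"
)

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("bacpacs")

try:
    container.create_container()
except ResourceExistsError:
    pass  # container already exists, which is fine

# Stream the local BACPAC up to the container.
with open("mydb.bacpac", "rb") as data:
    container.upload_blob(name="mydb.bacpac", data=data, overwrite=True)
```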
[Denny] If v2 of DACPAC blows up, would you prefer to be slow roasted over gas or open coals?
[Adrian] That depends. Is the purpose to inflict pain, or are you of the cannibalistic persuasion? Honestly, as MVPs are some of the most seasoned SQL consumers, we’d love to hear your feedback on the new upgrade engine as well as the overall application lifecycle experience that DAC enables. We are a nimble team, and there’s a great opportunity to incorporate fixes into our services for the Denali release. Unfortunately, because we were so focused on DAC 1.1, we didn’t have enough time to deliver a lot of DAC value in Denali CTP1; however, CTP3, coming this summer, will be fully DACified and will include all the latest and greatest: SQL Engine validation, in-place upgrades, and full support for SQL Azure application-scoped objects, including permissions and roles!
[Denny] It is pretty clear that DAC and DACPAC are geared mostly towards SQL Azure, as they support the current SQL Azure feature set. Can you tell us a bit about why the decision was made to push DAC and DACPAC as an on-premise solution instead of keeping the focus on SQL Azure until they were ready to support a fuller on-premise feature set?
[Adrian] Fantastic question. The reason it was positioned as an on-premise solution is that the SQL Azure story was still being written. If you rewind back to the days when 2008 R2 was working towards release, SQL Azure started out with this simple DB concept and was then reset to provide full relational features. At that time, we really weren’t sure if we wanted to dock the DAC roadmap to Azure, because the SQL Azure story was in flux. So the fallback position was to tie the DAC story to the box product, because we weren’t able to really commit to a direction for DAC and Azure. Since then, we’ve been straightening out the story in a big way, with partners and at TechEd.
[Denny] When we were hanging out at TechEd 2011 you seemed like you wanted to become more involved in the community. Did I guess this one right? Will you be joining us at events like PASS and the MVP Summit for some “learnin’ and camaraderie”?
[Adrian] Yes, I certainly hope to join you at PASS, and to have another couple of sessions at the next MVP Summit, but I don’t know with certainty yet.
[Denny] The most important question, would you prefer to be known as “DACPAC Adrian” or “DACPAC Dan 2.0”?
[Adrian] The former. There’s already a “DACPAC Dan 1.0” and we haven’t tested any side-by-side or upgrade scenarios. 🙂
I’d like to thank Adrian for being a sucker, er, good sport and agreeing to sit down with me, even knowing the beatings that I’ve given Dan over DACPAC v1. I hope that everyone enjoyed reading this interview as much as I enjoyed talking with Adrian.
All joking aside, Adrian is a great guy and a lot of fun to hang out with, and he’s got some great vision for Data Tier Applications and DACPAC. I just hope he’s able to pull off what he’s got planned. If not, we’ll be having a BBQ at the next MVP Summit, and Adrian will be the “guest of honor”.
Denny