Wednesday, December 26, 2007

Team Foundation Server Power Tools - TFS 2008

For those of us who actively use Team Foundation Server and have upgraded to TFS 2008, these tools are a must-have! It's good to see Microsoft keeping up with the need to deliver these add-ons quickly!

The tools can be found here: http://msdn2.microsoft.com/en-us/tfs2008/bb980963.aspx

Concurrent with the release of the power tools, Microsoft has also released an update to the TFS Source control provider. They have been pretty quiet about this, but the provider fixes several rather annoying bugs. It can be downloaded here: http://www.microsoft.com/downloads/details.aspx?FamilyId=FAEB7636-644E-451A-90D4-7947217DA0E7&displaylang=en

Monday, December 24, 2007

Addskills Expert Seminar - Analysis Services Deep Dive - Stockholm

I just returned from leading a two-day "Expert Seminar" hosted by Addskills (formerly Jonsson & Lepp) in Stockholm, Sweden. The course was well attended and quite challenging. I will use this blog entry to post additional materials and answer any further questions from delegates who attended the course.

Download the slide deck here: Expertseminarium SSAS 071217-18.pptx

Download the additional materials here: Analysis Services Expert Seminar - Additional Materials.zip

For those that attended the seminar, please don't hesitate to send any questions my way, either via the comments here, or email at ted.malone@gmail.com

Saturday, December 15, 2007

SQL Connections - Spring, 2008

I will be presenting a session on the SQL 2008 Resource Governor at the spring SQL Connections conference, which will be held April 20-23 at the World Center Marriott in Orlando, Florida.

If you're interested in learning about SQL Server internals, this will be a good session for you to attend. Here's the abstract for the session:

Introduction to the SQL Server 2008 Resource Governor

With multiple workloads on a single server, administrators must avoid problems such as a runaway query that starves another workload of system resources, or low priority workloads that adversely affect high priority workloads. SQL Server 2008 includes the Resource Governor, which enables administrators to define limits and assign priorities to individual workloads that are running on a SQL Server instance.

Attendees will gain insight into the need for the Resource Governor, will see live demonstrations of the Resource Governor in action, and will learn necessary tips and tricks on how the resource governor can be used to solve real-world problems.

Thursday, December 13, 2007

Visual Studio 2008 "Data Dude"

Gert Drapers just posted a "What's new in 2008" entry on his blog. Basically there's not a whole lot of new functionality, but there are several bug fixes and things of that nature. See the entry here: http://blogs.msdn.com/gertd/archive/2007/11/21/visual-studio-team-system-2008-database-edition.aspx

Saturday, December 8, 2007

People over Process

In an earlier blog post, I mentioned some of the tenets of Agile development and how they apply to Business Intelligence projects. One tenet that is close to my heart is "Individuals and interactions over processes and tools." Basically, what this tenet says to me is that any Agile process must be constructed to value the people more than the process. To me this just screams common sense, but I am truly amazed at how uncommon it seems to be these days.

My team is fairly small, but over the last year we've built a product that has already been sold and deployed into our customer base and has generated significant revenue. This despite the fact that many of the team members had never even heard of the technologies I was asking them to master (not to mention that when we started the project, NONE of the technologies we employed were beyond Beta). I honestly (and yes, I know I'm biased) don't know how you could be more successful with a project. Some think we got lucky, some think we just chose an easy path (this is the funniest as far as I'm concerned; I guarantee you that those who say this would have failed miserably given the same task), and there are those who think it was a combination of both...

In my spare time, I'm part of a process-improvement team whose charter is to construct an Agile process that will help us get products to market faster and more reliably. Part of that effort is developing various work items and work products that can be defined and rolled out to a wide audience. I was in a meeting the other day that started off innocently enough: we were reviewing a specific work product type that we want to use across all Agile projects. I looked at the output and thought, "Wow, I wouldn't have thought of this that way, but it looks pretty good. I like it and can use it." By the end of the meeting, the work product had been torn to shreds, and all sorts of workflow and parent/child relationships were now included. All I could think of as I left the meeting was that had I been forced to use this thing that was just constructed, there would be no way we'd have ever gotten off the starting line in my current project. This experience has caused me to come up with a set of 2 simple rules for any Agile project:

  1. The process should be minimally invasive, or even invisible
  2. When all else fails, see rule #1

If the tools surrounding a process are too rigid or too invasive, developers are NOT going to use them. Again, this is COMMON SENSE as far as I can tell....

Ok, I guess I'm done ranting for now....

SQL 2008 Resource Governor

As you might have noticed, in an earlier post I mentioned several new CTP5 features for SQL Server 2008, "Katmai". One of these new features is the Resource Governor. For those of us who've worked with SQL Server in larger-scale production systems, the Resource Governor is a godsend. I've recently had the opportunity to play around with the Resource Governor and thought I'd share some concepts here.

Resource Governor Overview

Simply put, the Resource Governor is a component of SQL Server 2008 that allows administrators to configure pools of resources (think IIS application pools, although it isn't quite that simple) and set specific limits on those resources. There are two main components that make up the Resource Governor:

  • Workload Group - A Workload Group is a container used to limit certain SQL Server-specific resources, such as the degree of parallelism or the number of concurrent user requests.
  • Resource Pool - A Resource Pool is a collection of system resources, such as memory or CPU.

A Workload Group is assigned to a Resource Pool, and the Resource Governor maps each incoming session to a Workload Group.

Configuring Resource Pools

Resource Pools (at least in CTP5 of SQL Server 2008) consist of settings for the minimum and maximum CPU and memory utilization. The following code creates a pool named "poolLimited" and sets the maximum memory utilization to 50% and the maximum CPU utilization to 30%.

CREATE RESOURCE POOL poolLimited
WITH
(
    MAX_CPU_PERCENT = 30,
    MAX_MEMORY_PERCENT = 50
);


You can also set MIN_CPU_PERCENT and MIN_MEMORY_PERCENT to reserve a minimum share of either resource for the pool.
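As a sketch of the MIN settings (the pool name and values here are hypothetical, just to show the shape of the statement), a pool that reserves a baseline share as well as capping the maximums might look like:

```sql
-- Hypothetical pool that reserves a minimum share of CPU and memory
-- in addition to capping the maximums (name and values are examples).
CREATE RESOURCE POOL poolGuaranteed
WITH
(
    MIN_CPU_PERCENT = 10,
    MAX_CPU_PERCENT = 50,
    MIN_MEMORY_PERCENT = 10,
    MAX_MEMORY_PERCENT = 60
);
```

Keep in mind that the minimums across all pools have to fit within 100%, since they represent reserved shares of the server.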
Configuring Workload Groups

Workload Groups are a little more complicated because they allow you to control some SQL Server-specific conditions as well as apply Resource Pools. The following code creates a Workload Group named wrkgroupLimited:
CREATE WORKLOAD GROUP wrkgroupLimited
WITH
(
    IMPORTANCE = MEDIUM,
    REQUEST_MAX_MEMORY_GRANT_PERCENT = 50,
    REQUEST_MAX_CPU_TIME_SEC = 300,
    REQUEST_MEMORY_GRANT_TIMEOUT_SEC = 300,
    MAX_DOP = 8,
    GROUP_MAX_REQUESTS = 30
)
USING poolLimited;


This code sets the overall importance of the group to MEDIUM, which is the default (an enumeration used much like the OS "priority" setting for an execution thread); limits each request to 50% of the group's memory (no single request can consume more than half the memory allocated to the group); sets both the maximum CPU time per request and the maximum "wait for resources" time to 5 minutes; caps the degree of parallelism at 8; and limits the group to 30 outstanding requests. The poolLimited Resource Pool is assigned to the group via the USING clause.
Creating a Classifier Function

In order for SQL Server to use the Workload Groups you've created, you need to classify each connection to determine which group that connection falls into. This is done through a Classifier Function. Classifier Functions are new in SQL Server 2008 and execute each time a new connection is made to the server. They are scalar user-defined functions that return the name of the target Workload Group for each user connection. The following code creates a Classifier Function that tests whether the application making the connection is Management Studio, and if so, assigns the connection to the wrkgroupLimited Workload Group. If the application is not Management Studio, the default Workload Group is used.
CREATE FUNCTION fnsRsGovMgr() RETURNS SYSNAME
WITH SCHEMABINDING
AS
BEGIN
    DECLARE @grpName SYSNAME;
    IF (APP_NAME() = 'Microsoft SQL Server Management Studio')
        SET @grpName = 'wrkgroupLimited';
    ELSE
        SET @grpName = 'default';
    RETURN @grpName;
END;

Assigning a Classifier Function to the Resource Governor

The final step in configuring the Resource Governor is to assign the classifier function and activate the configuration. The following code assigns the fnsRsGovMgr function to the Resource Governor:
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION=dbo.fnsRsGovMgr);

And finally, the following code activates the configuration:
ALTER RESOURCE GOVERNOR RECONFIGURE;

Now, when a user connects to the SQL Server instance using Management Studio, their system resource utilization will be very limited.
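To confirm that classification is working the way you expect, you can join the sessions DMV to the Resource Governor workload group DMV. This is a sketch based on the CTP5 dynamic management views; names could still change before release:

```sql
-- Show which Workload Group each current user session was classified into.
SELECT s.session_id,
       s.program_name,
       g.name AS workload_group
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_resource_governor_workload_groups AS g
    ON s.group_id = g.group_id
WHERE s.session_id > 50;  -- skip system sessions
```

Sessions from Management Studio should show up in wrkgroupLimited, and everything else in the default group.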

Friday, November 30, 2007

Undetected Deadlocks in SQL Server

Every once in awhile, things that are hard to explain happen when running resource-intensive queries against SQL Server. Yesterday (and by extension, this morning) I had one of those happenstances.

There's a phenomenon in the SQL Server engine known as an "Undetected Deadlock". They have been around for awhile, and there have been numerous patches and hotfixes related to the problem.

Basically, these undetected deadlocks can happen in many places, but the most common cause is parallelism. For example, a query gets parallelized, CXPacket waits occur, and then somehow the SQLOS CPU scheduler gets "out of whack": one or more CPUs lose track of the fact that they're participating in a parallel query and try to move on to other things. In rare cases, this leads to a deadlock because the CPUs are no longer in sync with one another. At that point, all activity on the server comes to an end behind a "Suspended" process. Checking the SQL Server message log (I will still call it an error log!) shows the following:


SQLServerError

The only way out of this mess is to kill the process that caused it in the first place. If you look closely at this error log, you'll see that the problem actually started at 8pm and wasn't resolved until 6am, when I killed the process.
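For what it's worth, here is the kind of query I use to spot the culprit before reaching for KILL. It's a sketch against the standard SQL Server 2005/2008 DMVs; a suspended request with an enormous CXPACKET wait time and no progress is the smoking gun:

```sql
-- Find suspended requests and what they're waiting on; long-running
-- CXPACKET waits that never clear point at an undetected deadlock.
SELECT r.session_id,
       r.status,
       r.wait_type,
       r.wait_time,
       r.blocking_session_id,
       t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.status = 'suspended'
ORDER BY r.wait_time DESC;

-- Once you've identified the offender: KILL <session_id>;
```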

Wednesday, November 28, 2007

Visual Studio 2008 & SQL Server 2005

I made a pretty big mistake the other day. I installed Visual Studio 2008 Team Suite on my primary desktop workstation. Instead of doing the smart thing and installing it side by side with Visual Studio 2005, I uninstalled Visual Studio 2005, then installed 2008.


The main reason that this is a bad idea can be summed up in the following picture:

image

Because I uninstalled VS2005, I broke SQL Server Business Intelligence studio. Since most of the code I interact with on a regular basis is SQL Server related, this is a problem.


In order to solve the problem, I have to reinstall SQL Server 2005 Workstation components and then reapply SP2 for SQL Server (and the subsequent rollup patch). All in all a pain in the neck.

Don't make this same mistake!

Monday, November 19, 2007

WTF?? Or, more appropriately, Error Dialogs

My laptop has been giving me fits lately, so I decided to augment my image backup with a file backup using the Windows Vista Backup and Restore Center... About 30 minutes into the backup task, the following dialog appeared:

image

Now, I know I'm not the smartest guy in the world, but does this dialog make sense to ANYONE????

SQL Server 2008 "Katmai" CTP 5 New Features

Well, the long wait for Katmai CTP5 (the "November CTP") is finally over. Microsoft has released CTP5 and posted the bits to download.microsoft.com. (Don't look for the bits on Connect just yet, as they are not posted there.)

New Features

There are a bunch of long-awaited features in CTP5, but here are some of my favorites:

Reporting Services Scale Engine and Robust Server Fit and Finish

A reengineered memory management and scalability infrastructure lays a solid foundation of scalability for enterprise customers.

Transparent Data Encryption

SQL Server 2008 enables encryption of an entire database, its data files, and its log files, without the need for application changes. Encryption enables organizations to meet the demands of regulatory compliance and address overall concerns for data privacy. Benefits of transparent data encryption include searching encrypted data using both range and fuzzy searches and securing data from unauthorized users, all without changing existing applications.

Resource Governor - Limit Specification

Resource Governor allows organizations to define resource limits and priorities for different workloads, which enables concurrent workloads to provide consistent performance to end users. This enhancement specifically delivers the resource limit functionality.

Backup Compression

Keeping disk-based backups online is expensive and time-consuming. With SQL Server 2008 backup compression, less storage is required to keep backups online, and backups run significantly faster since less disk I/O is required.
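Based on the CTP5 syntax, compression is just a WITH option on the backup statement itself (the database name and path here are hypothetical):

```sql
-- Compressed backup; compare the resulting file size against an
-- uncompressed backup of the same database to see the savings.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AdventureWorks.bak'
WITH COMPRESSION, INIT;
```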

Plan Freezing

SQL Server 2008 enables greater query performance stability and predictability by providing new functionality to lock down query plans, enabling organizations to promote stable query plans across hardware server replacements, server upgrades, and production deployments.
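As I understand the CTP5 bits, this surfaces through new plan guide functionality that lets you freeze a plan straight out of the plan cache. A sketch (the guide name and the query text used to find the plan are hypothetical):

```sql
-- Freeze the cached plan for a given statement (sketch of the CTP5
-- sp_create_plan_guide_from_handle functionality).
DECLARE @plan_handle varbinary(64);

SELECT @plan_handle = qs.plan_handle
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%FROM Sales.SalesOrderHeader%';  -- hypothetical query

EXEC sp_create_plan_guide_from_handle
    @name = N'FrozenSalesPlan',
    @plan_handle = @plan_handle;
```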

Fully Parallel Plans Scale on Partitioned Tables

Partitions enable organizations to manage large growing tables more effectively by transparently breaking them into manageable blocks of data. SQL Server 2008 builds on the advances of partitioning in SQL Server 2005 by improving the performance on large partitioned tables.

Data Collection and Performance Warehouse for Relational Engine

Performance tuning and troubleshooting are time-consuming tasks for the administrator. To provide actionable performance insights to administrators, SQL Server 2008 delivers more extensive performance data collection, a new centralized data repository for storing performance data, and new tools for reporting and monitoring.

Service Broker Enhancements

Getting the data to the right place at the right time is important.  Service Broker Conversation Priority in SQL Server 2008 gives you greater control over the system by making it easy to configure priority rules so that the most important data is sent first and processed first.

Registered Servers Enhancements

Enhancements to the Registered Servers tool window in Management Studio include running T-SQL queries and policies against groups of servers and the ability to share a common, centrally stored, server topology (Database Engine only).

Synchronous net-changes change tracking for SQL Server

The SQL Change Tracking feature provides the functionality to synchronously track changes to data in user tables without the need to create triggers or modify the table's schema. Applications can reliably determine what data has changed since a watermark/baseline and obtain the latest data. The feature is geared toward providing this functionality with the least DML overhead.
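The CTP5 syntax looks roughly like this (the database and table names are hypothetical, and the tracked table needs a primary key):

```sql
-- Enable change tracking at the database level, then per table.
ALTER DATABASE SalesDB
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Customers
ENABLE CHANGE_TRACKING;

-- Save a baseline version, then later ask for everything changed since it.
DECLARE @baseline bigint = CHANGE_TRACKING_CURRENT_VERSION();

SELECT ct.CustomerID, ct.SYS_CHANGE_OPERATION
FROM CHANGETABLE(CHANGES dbo.Customers, @baseline) AS ct;
```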

T-SQL IntelliSense

Transact-SQL IntelliSense provides intelligent aids for Transact SQL scripting that make language references easily accessible for database developers. When coding, you do not need to leave the Database Query Editor to perform searches on T-SQL language elements or your database metadata.  You can keep your context, find the information you need, insert T-SQL language elements directly into your code, and even have IntelliSense complete your typing for you. This can speed up software development by reducing the amount of keyboard input required and minimize references to external documentation.

Declarative Management Framework (DMF) Enhancements

Enhancements to DMF include more expressive conditions (including support for common functions, T-SQL, and WMI queries), more robust target set filtering, custom messages for policies, and the inclusion of best practice policies.

Geo-spatial Support

SQL Server 2008 delivers comprehensive geo-spatial support. The new GEOGRAPHY and GEOMETRY data types provide spatial data support for location-aware applications.  These types can be used to store locations, as well as paths and regions in space, and provide a rich set of functionality for comparing and manipulating these objects.  Use the GEOGRAPHY type when working with latitude and longitude coordinates in a true round-earth model; use GEOMETRY when working in projected planar surfaces, as well as naturally planar systems such as interior spaces.

These types are supported by new spatial indexes, which provide for fast execution of queries involving spatial data. The query optimizer has been enhanced with built-in knowledge of spatial indexes and types, so that appropriate cost-based plan decisions can be made.
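A quick sketch of the GEOGRAPHY type as it appears in CTP5 (the coordinates are just examples):

```sql
-- Distance in meters between two lat/long points in SRID 4326
-- (the round-earth model).
DECLARE @seattle geography = geography::Point(47.6062, -122.3321, 4326);
DECLARE @portland geography = geography::Point(45.5152, -122.6784, 4326);

SELECT @seattle.STDistance(@portland) AS distance_in_meters;
```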

Analysis Services Query and Writeback Performance (FITS)

New MOLAP-enabled write-back capabilities in SQL Server 2008 Analysis Services remove the need to query ROLAP partitions. This provides users with enhanced writeback scenarios from within analytical applications without sacrificing the traditional OLAP performance.

Robust Report Server Platform

Reports can easily be delivered throughout the organization with simplified deployment and configuration. This enables users to easily create and share reports of any size and complexity.

Integration Services - Lookup Enhancements

The need to perform lookups is one of the most common extraction, transformation, and loading (ETL) operations. This is especially prevalent in data warehousing where fact records must use lookups to transform business keys to their corresponding surrogates. SSIS increases the performance of lookups to support the largest tables.

Analysis Services MDX Query Optimizer - Block Computation

Block computations provide a significant improvement in processing performance, enabling users to increase the depth of their hierarchies and complexity of the computations.

Analysis Services Aggregation Design

SQL Server 2008 drives broader analysis with enhanced analytical capabilities and with more complex computations and aggregations. The AS Aggregation Design improvement exposes Aggregation Design objects in SQL Server BI Dev Studio and SQL Server Management Studio and provides tools for users to better work with these aggregation designs. In addition, an advanced view in the new Aggregation Design tab of the cube editor provides the ability for an advanced user to view and manually edit individual aggregations within an aggregation design.

Analysis Services Cube Design

New cube design tools help users streamline the development of the analysis infrastructure, enabling them to build solutions for optimized performance. The AS Cube Design improvement introduces a new Cube Wizard which helps users create better cubes in fewer steps. The new wizard focuses on having the user answer a few questions to create leaner cubes that better target their needs. It also unblocks the previously difficult scenarios of creating a cube based on a single, de-normalized table and creating a cube containing only linked dimensions.

Tuesday, November 13, 2007

SharePointPedia

Ok, so the name is stupid, but I think the idea actually has some merit!

Microsoft has quietly been putting together a real, honest-to-goodness Internet-facing site that is 100% SharePoint. They call the site SharePointPedia, and it can be found here: http://sharepoint.microsoft.com/pedia/Pages/Home.aspx (actually that's what the DNS entry gets redirected to, the "real" URL is: http://sharepointpedia.com/)

The idea is to collect all things SharePoint related into a single location. Right now the site is a collection of articles and issues that have basically been culled from the MSDN Forums.

Thursday, November 8, 2007

SQL Server 2008 "Katmai" on Windows Server 2008 "Longhorn"

I have recently had the need to work with SQL Server 2008 installed on Windows Server 2008. Since both of these products are in Beta (one more "beta" than the other!), this can really be a challenge.

One of the first things I noticed after installing Katmai (CTP4, the July CTP) on Longhorn (RC0++) was that you could not connect to SQL Server, even though the installation completed without error. When you attempt to connect, the following happens:

image 

Digging into the error generated, the following TDS error is thrown:

image


After some digging around, I found the answer to this problem. It just so happens that the version of SqlClient is hosed on the default install of WS08/Katmai and you have to fix it. You do so by installing Visual Studio 2008 "Orcas" Beta 2!


There's a nice post on the Katmai forum that details what does and does not work together with Katmai:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2300504&SiteID=1

Microsoft PerformancePoint 2007 is Live!

Microsoft announced this morning that their new PerformancePoint 2007 is officially available. PerformancePoint is the next step in bringing Business Intelligence information to everyone.

Evaluation editions are available as follows:

There are also new MSDN forums and other content available as follows:

We are happy to announce the Microsoft Forums for Microsoft PerformancePoint Server, ProClarity and Master Data Management. Learn more about these new Forums by reading the announcement on the New Microsoft PerformancePoint Server 2007 Forums. (https://connect.microsoft.com/content/content.aspx?ContentID=6853&SiteID=181)

Documentation and Training

Check out the Office Developer Portal (http://msdn2.microsoft.com/en-us/office/bb660518.aspx) and TechCenter (http://technet.microsoft.com/office/performancepoint/default.aspx) for deployment and operations guides, technical whitepapers and 'how-to' help and tips.

Finally, look for the PerformancePoint Server 2007 books (http://sharepoint/sites/BISL/Documents/BookPlan.pptx) and online training webcasts (http://www.msreadiness.com/TrainingSearch.aspx?prodid=0&keyword=performancepoint), as they should help your prospects, customers and partners get started!

Friday, November 2, 2007

SQL Server Integration Services, Oracle Client and the 64-bit Platform

I was reminded today that I had promised some time ago to create a post about my experiences with SSIS, Oracle and 64-bit environments.

The current project that I am responsible for, Configuresoft's Configuration Intelligence Analytics, is built to extract data from a number of source systems, including those housed in an Oracle Database. One of our major Joint Development Partners has standardized on 64-bit operating systems for all tools, including our CIA product. As it turns out, many of our development environments are 64-bit as well, so from my perspective I figured this would be an easy scenario to support.

WRONG!

As it turns out, the Oracle client is very peculiar when installed into 64-bit environments. The installer detects that it is 64-bit native and by default will only register 64-bit instances of itself. This is all well and good, of course, *if* you are using 64-bit native components all through the communication channel (i.e., if the application you're building is 64-bit). The unfortunate thing is that the SSIS development environment is a 32-bit application. So when you initially install a 64-bit Oracle client, you actually cannot use Business Intelligence Development Studio (BIDS) or Visual Studio to test (or even validate) the Oracle-facing SSIS components. Fortunately, Oracle has thought that through and provided the ability to install a 32-bit version of the client components. Again, all well and good, except that by default, when you use DTEXEC to execute the SSIS packages (or use a SQLAgent job), you're actually running a native 64-bit application. ARRRRRRRRRRRRRRGH! So you test and validate the package in BIDS and it succeeds, but it fails when you run it for "production" purposes.

To make matters worse, even if you properly install and register the Oracle client (I'm specifically talking about the Oracle 10 client), you'll still have problems due to a bug in the 10.2.x.x Oracle client that precludes the use of many special characters, either in a username or in a program path (WTF?!?!), that uses the client. The bug is Oracle bug # 3807408, and it is fixed in Oracle patch # 5383042. But wait, there's more!! The patch is valid only for the 10.2.0.2 Oracle client, which CANNOT be downloaded anywhere. You must patch a 10.2.0.1 client to get the 10.2.0.2 version. That patch is available (oh, by the way, it's only available for 32-bit environments!) as Oracle patch # 4547817.

Oh, and did I mention that you have to pay Oracle for the patches?

Did I say ARRRRRRRRRRRRRRRRRRRRRRRRRRRRRGH! yet?

By now I'm sure (assuming you've read this far) that you're wishing I'd just cut to the chase...


Oracle and SSIS 64-bit Happiness

The following steps are in my opinion the only steps you can take to get SSIS and Oracle to play well in 64-bit environments:

  1. Install the Oracle 10.2.0.1 32-bit client (This is the client that is available from Oracle's website)
  2. Install the Oracle # 4547817 patch (which will upgrade your client to 10.2.0.2)
  3. Install the Oracle # 5383042 patch (fixes Oracle bug # 3807408)
  4. Develop your SSIS Packages
  5. When ready to execute the package, do NOT use a SQLAgent SSIS Package task to execute them. Use a CMDExec task that explicitly uses the 32-bit version of DTEXEC (located in Program Files (x86)\Microsoft SQL Server\90\DTS\binn)
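For step 5, the CmdExec command line looks something like this (the package path is hypothetical; the important part is explicitly using the 32-bit DTExec under Program Files (x86)):

```
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /FILE "D:\SSIS\ExtractOracle.dtsx"
```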

The above steps will work 100% of the time. However, you will lose the 64-bit execution environment benefits (better memory management basically) in your SSIS packages. You may want to consider breaking out any package that requires Oracle Connectivity to run as a separate job, that way you can limit the impact of the 32-bit only environment.

Thursday, November 1, 2007

SQL Server Integration Services Package Manager Utility

Joey Demaio's dear friend (and my colleague) Matthew Roche has recently released his SSIS Package Manager Utility (known as PacMan) into the wild on CodePlex.

This tool has been incredibly useful for my team at Configuresoft and I hope it will be useful to you too. If you develop packages in SSIS, you most certainly need this tool!

PACMan

Using witexport.exe to Generate a TFS WorkItem Type

The "WorkItem" is a very important part of any collaborative development effort. Basically, a WorkItem represents a specific task that must be completed by someone in order for the project to move forward. When working with Visual Studio Team System and Team Foundation Server, you will notice that each project has a specific guidance template associated with it. In the case of an out-of-the-box installation, you have the MSF for Agile template and the MSF for CMMI template. Generally speaking, these are good enough for most projects. There are a number of add-on templates available, and Microsoft provides a Process Template Editor (now part of the Power Tools) that allows you to create your own.

In my current project, we are primarily using MSF 4.2 for Agile as our guiding template, but we need to track both Requirements and Scrum-like sprints. The "Requirement" WorkItem type is included in the MSF for CMMI template that ships with TFS, and the "Sprint" WorkItem type is included with the downloadable eScrum template. The only way to combine these WorkItem types is to extract them from their respective templates and, in my case, import them into the Agile template.

Exporting a WorkItem Type

Before you can export a WorkItem type, you need to ensure that you have a project created that contains the Type you wish to export. It's probably a good idea to create a project on a development server (Or a sandbox server if you have one available) so that you can keep them around as necessary.

In order to export a WorkItem Type, you must use the witexport.exe utility, which is located in the <install drive>\Program Files\Microsoft Visual Studio 8\Common7\IDE folder. It is actually a very simple utility to use. In my case, the server I want to connect to is named DEVCIATFS and the project is named CSIDemoeScrum (using the eScrum template). In order to extract the "Sprint Details" WorkItem type, run the witexport.exe utility as follows:

image

(In case you can't read the capture above, the command line is witexport.exe /f SprintDetails.xml /t DEVCIATFS /p CSIDemoeScrum /n "eScrum Sprint Details" )

The result is a fully-formed XML file that contains the definition for a Sprint Details WorkItem that when imported into TFS looks like this:

image
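To complete the round trip, the exported definition can be imported into the target project with witimport.exe, which lives in the same folder. The target project name below is hypothetical; substitute your own Agile project:

```
witimport.exe /f SprintDetails.xml /t DEVCIATFS /p MyAgileProject
```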

Thursday, October 25, 2007

Agile Development and Requirements Tracking

I'm currently in the process of building a business-case for wider-adoption of Visual Studio Team System and Team Foundation Server within our company. During this process, I've come to learn a couple of things:

  1. Agile Development ROCKS - I've stated before that I believe any Business Intelligence project is going to be best served by an agile process, and since my life over the last year has been deeply rooted in business intelligence, I've been living this. Having had to stick my head up over the wall and take stock of what others are doing, I can only say that I am so happy that I'm doing what I am... :)
  2. Agile is not for everyone! - This is a well-known fact, but it bears repeating here. Many development teams have their lives dictated to them by a series of requirements and are used to functioning in that manner. They tend to want to know exactly what they're building when they start.

One of the big differences between Agile projects and more traditional methods is the way in which requirements are gathered and tracked. In an Agile project, requirements are thought of in terms of business scenarios, which can be related roughly to use-cases. While in the requirements phase of an Agile project, Product and Project Managers spend the bulk of their time developing the scenario. The process generally looks like this:

  1. Product Manager interviews potential customers to determine basic functionality. Product Manager compiles customer feedback into a series of "things" that they want to accomplish with the product.
  2. Product Manager writes a business scenario for each of the high-level "things" that the customer wants to accomplish.
  3. Project Manager examines each business scenario and gathers input from the team on how difficult it is.
  4. Project Manager takes a SWAG (if you haven't heard this term before, WikiPedia it - second bullet under initialism) as to how long each scenario will take.
  5. Product and Project Managers negotiate a rough development schedule that everyone agrees will change. (This bullet is by far the most important one in the process and is the number one cause in my opinion as to why Agile isn't for everyone!)

For this reason, the Microsoft Solutions Framework (MSF) for Agile process template that is included with Team Foundation Server does not include a "Requirement" work item type. Over the next few days, I'm going to be spending time adding the CMMI "Requirement" work item type to the Agile template and customizing the linkage between a scenario and a requirement.
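As a preview of where I'm headed: a TFS work item type is defined in XML and imported with the witimport command-line tool. Below is a heavily abridged sketch of what a "Requirement" type definition looks like. The field reference names are my assumptions based on the MSF for CMMI template, so export and check your own definitions before importing anything:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Abridged sketch only: a real definition also includes field rules,
     workflow states and transitions, and a FORM layout section. -->
<WITD application="Work item type editor" version="1.0"
      xmlns="http://schemas.microsoft.com/VisualStudio/2005/workitemtracking/typedef">
  <WORKITEMTYPE name="Requirement">
    <DESCRIPTION>Tracks a requirement and links it back to a scenario.</DESCRIPTION>
    <FIELDS>
      <FIELD name="Title" refname="System.Title" type="String" />
      <!-- Assumed refname, borrowed from the MSF for CMMI template -->
      <FIELD name="Requirement Type" refname="Microsoft.VSTS.CMMI.RequirementType" type="String" />
    </FIELDS>
  </WORKITEMTYPE>
</WITD>
```

In practice, the approach I'm planning to try is exporting the CMMI template's own Requirement definition with witexport and re-importing it into an Agile team project.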

Wednesday, October 24, 2007

Visual Studio Team System Resources

I was doing a bit of research today and hit Anton Delsink's blog and found this very nice mind map of Visual Studio Team System resources. If you are looking for something that has to do with VSTS, this is what you need!

Tuesday, October 23, 2007

Managing Documentation Projects in TFS

The Work Item Tracking (WIT), Tools and Reporting team for Microsoft Visual Studio Team System has created a blog post on their experiences using TFS to track documentation (as opposed to code) projects. Interesting stuff. Check it out here: http://blogs.msdn.com/teams_wit_tools/archive/2007/10/16/managing-documentation-projects-in-team-foundation-server-part-1-planning-the-sprint.aspx

Team Development with Team Foundation Server

The Microsoft Patterns and Practices team has recently released the "Team Development with Visual Studio Team Foundation Server" guide. It's chock-full of great information to get you started with TFS.

There's so much content and extra "stuff" available for Team Foundation Server that it really shouldn't surprise me when I come across something I've not seen before. What's irritating about *this* particular guide, however, is that it's something that I really could have put to serious use over the last couple of months...

There is a page on CodePlex that describes the guide here: http://www.codeplex.com/TFSGuide

Oh well.


Monday, October 22, 2007

Server Requirements for Team Foundation Server 2008

As anyone who knows me has come to realize, I'm a huge fan of Microsoft Team Foundation Server. I can't really explain in a simple post why it is that I like it so much, but I can say that most of it comes back to the fact that the TFS toolset is almost invisible. I don't have to go out of my way to interact with source control or the project portal, it's just there when I need it...

With TFS 2008, Microsoft has improved performance and operation to the point that they have come out with a new set of hardware recommendations. Brian Harry has blogged about the requirements here: http://blogs.msdn.com/bharry/archive/2007/10/18/tfs-2008-system-recommendations.aspx

Friday, October 12, 2007

Agile Project Management

One of the tenets of Agile programming is "working software", which can be interpreted any number of ways, but always boils down to the fact that agile developers should plan on delivering a "shippable" build more frequently than developers using traditional methods. This actually puts a lot of focus on the agile project manager, because they need to monitor more closely both the quality of the code and its coupling to other systems and services. In my experience, most project managers tend to look at agile development methodologies and classify them as valid only for small projects, ignoring them for larger projects, which in my opinion is a huge mistake.

Service Oriented Architectures and Agile Projects

When Service Oriented Architecture (SOA) started becoming the buzzword in the software industry, developers began to look at their products and made a concerted effort to break the components into building blocks that were all disconnected from one another yet functioned together as a whole. This process fits well with the object-oriented world, and most development toolkits and patterns take SOA for granted these days. In my opinion this same transformation needs to occur in the Project Management world: project managers need to break their projects down into sub-projects and then look at each of those as a candidate for agile development methodologies.

Decoupling and the Agile Manifesto

Remember that the Agile Manifesto states that customer collaboration should be valued more than contract negotiation (which in this context means that you let the customer drive your requirements as opposed to developing to a strict set of predefined ones). This is very hard for many project managers to accept, but it can be very helpful when trying to decompose, or decouple, larger projects into smaller, more agile ones. Take for example a highly-coupled software system (say that it performs a configuration management function) that has a 1-year development cycle for each release. Trying to switch the entire project to an agile-based methodology would be doomed to failure. Decoupling both the software components and the projects around them, however, stands a fair chance of working well in the Agile world.

Thursday, October 11, 2007

MDX Studio CTP Release

One of the issues that we face in the Microsoft-centric Business Intelligence world is the fact that the tools (from the client stack all the way to the development environment) make things relatively easy on the "happy path" (that place where most of the people want to go), but any time you step off that path, you really have to interact with the Multidimensional Expressions (MDX) language. To say that MDX is complicated would be tantamount to saying that the Grand Canyon is deep.
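To give a flavor of what stepping off the happy path looks like, here's a small MDX query of the sort a dedicated tool helps with. It's a sketch written against Microsoft's Adventure Works sample cube, so the measure and hierarchy names assume that sample database:

```mdx
-- Top 5 products by Internet sales, plus each product's share of the total
WITH MEMBER [Measures].[Sales Share] AS
    [Measures].[Internet Sales Amount]
    / ([Product].[Product Categories].[All Products],
       [Measures].[Internet Sales Amount]),
    FORMAT_STRING = 'Percent'
SELECT
    { [Measures].[Internet Sales Amount],
      [Measures].[Sales Share] } ON COLUMNS,
    TopCount([Product].[Product].[Product].Members,
             5,
             [Measures].[Internet Sales Amount]) ON ROWS
FROM [Adventure Works]
```

Even this simple "top N with percent of total" question already requires calculated members, tuples and set functions, which is exactly why tooling support matters.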

Fortunately, Mosha Pasumansky (the "father" of MDX) understands this, and has recently released a tool to assist developers who need to interact with MDX. The tool is called "MDX Studio" (imagine that!) and is available for free download from the new Windows Live SkyDrive service here: http://cid-74f04d1ea28ece4e.skydrive.live.com/browse.aspx/MDXStudio/v0.1

Here's a screen shot of MDX studio running on my laptop:

[screenshot: MDX Studio]

Wednesday, October 10, 2007

SQL Server Reporting Services and SharePoint Integration video

The geekSpeak session that I did last month has been posted to Channel 9. You can view it here: http://channel9.msdn.com/shows/geekSpeak

geekSpeak - SQL Server Reporting Service with Ted Malone

Listen in on this geekSpeak where expert Ted Malone shares his real-world experience implementing SQL Server Reporting Services and more. His company Configuresoft has created a product built around all of Microsoft's BI and collaboration tools.
Ted is an architect and developer who works with SharePoint, SQL Server Analysis Services, SQL Server Integration Services and, of course, SQL Server Reporting Services.
Co-hosts Glen Gordon and G. Andrew Duthie get listener questions to Ted. They also engage in a useful discussion of 'What is BI?'
Demos and discussion also cover the what, why, when and where of SharePoint and SQL Server Reporting Services integration, data and collaborative tools all working together. His talk also includes information about the BDC, or the Business Data Catalog.

Visual Studio Team System - DBPRO Power Tools

I have been very busy over the last few months in the Business Intelligence world, so I've been sort of neglecting the work that my friends over on the Visual Studio Team System team have been doing (fortunately there are no good technical editors reading my blog, or I'd probably get dinged for that last sentence!). Back in August, the DBPro team released the Microsoft Visual Studio Team System for Database Developers Power Tools, which includes some very nice functionality for the database developer who wants to move into the Agile world.

Chief among the features that are added to DBPRO with the power tools is something I've been harping about for a LONG time, and that's the ability to build a dependency graph directly in the development environment. This is especially helpful when you're trying to track down the impact of any refactoring changes.

Other tools delivered with this add-on are:

Code Analysis

• Static Code Analysis - A precursor to functionality coming in future versions of VSTS that allows you to perform static code analysis on T-SQL code.

Refactoring


• "Move Schema" Refactoring - Allows a user to right-click on an object and move it to a different, existing schema.
• SP Rename Generation - Generates a new script containing sp_renames for all rename-refactored objects, which the user can then execute.
• Wildcard Expansion - Automatically expands the wildcard in a SELECT to the appropriate columns.
• Fully-Qualified Name Support - Automatically injects fully-qualified names when absent in a script.
• Refactoring extended to Datasets - Refactors into strongly typed dataset definitions.

MSBuild Tasks


• Data / Schema Compare Build Tasks - MSBuild tasks that can generate scripts as if the user had run the Data / Schema compare UI

Schema View

• API Access to Schema View - Insert / Update / Delete to schema View and list schema objects and their associated files

Miscellaneous Tools

• Script Preprocessor - Expands SQLCMD variables and include files; ships as both a command-line version (sqlspp.exe) and an MSBuild version (which wraps the command-line version).
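To make the Wildcard Expansion feature concrete, here's a hedged before/after sketch; the table and its columns are made up for illustration:

```sql
-- Before: a wildcard SELECT (dbo.Customer and its columns are hypothetical)
SELECT * FROM dbo.Customer;

-- After invoking Wildcard Expansion on the statement, the * is replaced
-- in-place with the table's actual column list, something like:
SELECT [CustomerID], [FirstName], [LastName], [EmailAddress]
FROM dbo.Customer;
```

Explicit column lists like this are what you want checked into source control anyway, since they make refactoring impact analysis far more reliable than `SELECT *`.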

Changes to Microsoft Office SharePoint Server 2007 Licensing

Well, this apparently escaped my notice last month. Microsoft has quietly relaxed some of the restrictions on MOSS 2007 licensing. One of the things larger customers were faced with was that you could not legally include both internal (intranet) and external (Internet) content on the same SharePoint farm. Well, according to the License FAQ, now you can:

Accommodation for simultaneous use of server software under Office SharePoint Server 2007 and Office SharePoint Server 2007 for Internet sites:

The same software is licensed under Office SharePoint Server 2007 and Office SharePoint Server 2007 for Internet sites under different use rights.  Office SharePoint Server 2007's use rights support private intranet sites and require CALs for licensed access, while Office SharePoint Server 2007 for Internet Sites does not require CALs, but does require that all content, information and applications be accessible through the internet to non-employees.  Please refer to the Product Use Rights (PUR) document for these products' use rights.

As an accommodation for possible deployment scenarios, customers wishing to consolidate their SharePoint needs under a single deployment may acquire licenses for both products, assign those licenses to the same server, and use the same running instance of the software simultaneously under both licenses.  However, customers must acquire CALs as required under the Office SharePoint Server 2007 use rights for users and devices accessing content in any manner not permitted under the Office SharePoint Server 2007 for Internet sites use rights.  

Tuesday, October 9, 2007

SharePoint 2007 sizing tool by HP

One of the biggest challenges faced by organizations that are deploying the latest generation of server-based products from Microsoft (SharePoint, Exchange, SQL Server, SMS, etc.) is that Microsoft appears to believe hardware is cheap and their customers are willing to spend whatever is necessary to deploy the latest and greatest. Back in the "old days" (you know, when 64K of RAM was all you were ever going to need) we focused on memory and CPU speed. I remember teaching SQL classes where I'd comment that you treated SQL Server just like an airplane: "If it doesn't fly fast enough, just put a bigger engine on it." (Of course that was always said in jest, but unfortunately it was a strategy that was employed all too often.)

These days, it's all about matching the IO subsystem to the application that you're running on the server. Memory and CPU speed are *almost* a byproduct. When customers purchase our product, we spend more time educating them on the IO subsystem than any other aspect of the hardware acquisition.

Well, it appears that HP is recognizing this trend as well, and (to give them credit) they're trying to stay out in front by offering a new tool to assist organizations that are about to deploy SharePoint 2007. The tool, which is available from HP as a free download (but does require registration), is pretty cool and does a good job.

Installing and Running the Sizing Tool

After you download the tool from HP, unzip the executable and run it. (If you're running Vista with UAC enabled, you'll need to answer the annoying UAC prompt). The tool takes a few moments to install, and then you'll be prompted to install the StorageWorks IO sizing tool as shown below:

[screenshot: StorageWorks IO sizing tool install prompt]

Once the tools are installed, you execute them from the program files shortcut, which will launch the "sizer home" as shown below:

[screenshot: sizer home]

Basically, the tool is a wizard that asks a few questions and then uses your answers to figure out the best deployment strategy. In my case (selecting a typical use-case for our Configuration Intelligence Analytics solution) the answer is:

[screenshot: the recommended configuration]

Once I agree that a single-server solution is the right one for my SharePoint deployment, the tool calculates the total cost and generates a "pick list" for the server components as shown here:

[screenshot: server component pick list]

Nice to see that the hardware vendors are picking up on the fact that people really do need to spec out the entire server, not just the CPU/memory configurations.

Wednesday, October 3, 2007

Big news from Microsoft - aka WOOT!

This is just fantastic news. Microsoft announced today that they will be releasing the source code libraries for .NET 3.5, *and* they'll be allowing integrated VS 2008 debugging on the framework itself..

Simply put, WOW

http://weblogs.asp.net/scottgu/archive/2007/10/03/releasing-the-source-code-for-the-net-framework-libraries.aspx

Sunday, September 30, 2007

Agile Business Intelligence

I was reading through the blogosphere this morning and came across some interesting observations on Agile BI. Basically, the author threw out the question, "What would our children turn out like if we raised them the way we create BI systems?" The author's contention is that most people treat BI like a protected entity until it is absolutely ready for consumption, and even then roll it out in a very limited fashion.

This got me to thinking about what Agile really means in the context of BI systems. The way I see it, Agile and BI are so naturally complementary that you almost have to embrace agile in order to build and deploy a successful BI system. BI systems are all about condensing data into usable information and then helping the consumer develop that information into workable knowledge. Everyone who's ever built a BI system knows that the best measure of success in BI is how well that knowledge is used to close the loop and effect change on the processes that the BI system measures. If this doesn't scream "agile, iterative development" I really don't know what does. Consider the following tenets of agile programming:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

Now consider how they apply to a BI project:

  • It doesn't matter what you use to build the product; it's all about understanding the requirements and working towards a system that fulfills them.
  • The system is going to change many times throughout the development lifecycle. Building a system that actually works and provides the output that the consumer needs is much, much more important than providing a manual and weeks of training.
  • Understanding what the customers need is more important than telling them what you're going to build.
  • See the second bullet. The system is going to change. The information needed to form knowledge will change. The desired output will change. The loop that you're trying to close will change. Everything changes.

I always tell my team that it is OK to be uncomfortable during the development cycle. As a matter of fact, I try and make sure that they ARE uncomfortable because that tells me that they're pushing the envelope. Waterfall methodologies just don't work in a BI world, and while agile can lead to discomfort, I really believe that the end result is well worth it.

Saturday, September 29, 2007

Configuration Intelligence Analytics - aka Really Cool Stuff!

I don't spend a lot of time talking about the products and services that my company, Configuresoft, offers in this blog, mainly because I wanted it to be more of a repository for technical information and things that might help others. However, sometimes things are just too cool not to mention.

In my role as Product Strategy Architect, I'm responsible for our Configuration Intelligence Analytics product. We recently went "Gold" on version 1.0 of the product. This is a product that could very easily revolutionize the systems management world (yes, I'm biased, so it's easy for me to say that). We've taken a bunch of leading and bleeding-edge products from Microsoft, applied time-tested Business Intelligence methodologies and delivered a product that marries an organization's Configuration Management data with operational data from their Service Desk. The result, in my not-very-humble-at-all opinion, is one kick-ass product. I was onsite with a customer this week to help deploy the tool into their environment, and it was really awe-inspiring for me. Have I mentioned lately how much I truly love my job?


SANs and SQL Server Performance

Those that know me know that I spend a fair amount of time in the world of SQL Server internals and performance tuning. Over the last year I haven't been able to spend much time in that space due to other project priorities. It's good to see, though, that there's more and more information being published/blogged/etc. about SQL perf tuning.

Here's a really great article by Linchi Shea over on SQLBlog that caught my eye (yes Chris, I'm aware you blogged about it first!).

The best thing about this article is that it really isn't a "SANs are bad because...." type article, but rather gives you examples of things to test in your particular environment.

Monday, September 24, 2007

Exception Handling in WCF

I have recently had the pleasure (yes, it really is a pleasure!) of working a bit closer with the new .NET technologies such as Windows Communication Foundation (WCF). One of the things that I've found myself struggling with is some of the more esoteric aspects of precedence in exception handling. (Yeah, I know, this is basic stuff, but I was surprised at how I got caught by this, and figure maybe there are one or two more out there who can benefit from my bumbling.)

Consider the following code snippet (from MSDN):

using System;
using System.ServiceModel;
using System.ServiceModel.Channels;
using Microsoft.WCF.Documentation;

public class Client
{
    public static void Main()
    {
        // Picks up configuration from the configuration file.
        SampleServiceClient wcfClient = new SampleServiceClient();
        try
        {
            // Making calls.
            Console.WriteLine("Enter the greeting to send: ");
            string greeting = Console.ReadLine();
            Console.WriteLine("The service responded: " + wcfClient.SampleMethod(greeting));
            Console.WriteLine("Press ENTER to exit:");
            Console.ReadLine();
        }
        catch (TimeoutException timeProblem)
        {
            Console.WriteLine("The service operation timed out. " + timeProblem.Message);
            wcfClient.Abort();
            Console.ReadLine();
        }
        // Catch the contractually specified SOAP fault raised here as an exception.
        catch (FaultException<GreetingFault> greetingFault)
        {
            Console.WriteLine(greetingFault.Detail.Message);
            Console.Read();
            wcfClient.Abort();
        }
        // Catch unrecognized faults. This handler receives exceptions thrown by WCF
        // services when ServiceDebugBehavior.IncludeExceptionDetailInFaults
        // is set to true.
        catch (FaultException faultEx)
        {
            Console.WriteLine("An unknown exception was received. "
                + faultEx.Message
                + faultEx.StackTrace);
            Console.Read();
            wcfClient.Abort();
        }
        // Standard communication fault handler.
        catch (CommunicationException commProblem)
        {
            Console.WriteLine("There was a communication problem. " + commProblem.Message + commProblem.StackTrace);
            Console.Read();
            wcfClient.Abort();
        }
    }
}

Notice that the generic CommunicationException is handled as the last item in the catch block. The reason is that FaultException actually derives from CommunicationException, so if the CommunicationException handler came first, it would catch every FaultException before the more specific handlers ever had a chance to run...

What this means practically is that you must take care and order your exception handlers properly when dealing with these types of Exceptions. Generally speaking, when writing WCF-based code, you'll want to structure your exception handlers similar to the sample above.

Friday, September 14, 2007

The moment of "Duh"!

During my geekSpeak session on Wednesday, I was asked to provide a high-level overview of Business Intelligence. I provided what I thought was a decent answer, had a couple of follow-on statements, and Glen chimed in with some info as well. Sounds good, right? Well, here it is Friday now, and I was just going through the MSDN SharePoint forums on BI when it hit me: "Duh!" Why didn't you tout Lynn's book on the subject?

So, better late than never. Lynn Langit has written a fantastic business intelligence overview book.

Thursday, September 13, 2007

Yes, he really did beat me!

It was pointed out to me by my friend and colleague Chris Randall (who, by the way, is an excellent SQL trainer; if you ever need to take a SQL class, you should look him up at Ameriteach in Denver) that he blogged about the SQL Internals Viewer in this blog post well before I blogged about it...

So yes, it has been proven that there is at least one person who's more geeky than I when it comes to SQL Server guts!

SharePoint 2007 "Farm" Sizing

Yesterday, during my geekSpeak webcast (which was fun, by the way) the question came up as to how many SSRS instances could be supported per SharePoint farm. I kind of stumbled through the question and didn't have a very good answer, so I spent a little bit of time yesterday researching the topic. Unfortunately I don't have a much better answer to post here today, but one thing that comes to mind is that there is a clear statement from Microsoft that there should be 1 database server for every 8 web servers in a farm. (See the Microsoft SharePoint sizing recommendations here: http://technet2.microsoft.com/Office/en-us/library/6a13cd9f-4b44-40d6-85aa-c70a8e5c34fe1033.mspx?mfr=true )

So, using the ideas in the document above, and understanding that there can only be 1 SSRS instance for each Web Server, and "taking a bit off the top" for the overhead on the database server that an SSRS instance will generate, I'd say you should modify the recommendations slightly and say that you should limit SSRS integration to 5 instances of SSRS per Database Server in the farm. (This is in no way an official limitation, it's simply conservative extrapolation on my part)

Wednesday, September 12, 2007

SQL Server Storage Engine Internals

I'm a geek. Yes, it's true.. No question about it.. As mentioned in earlier posts, I've written articles and book chapters dedicated to the internal workings of various SQL Server components.. Well, it appears that I'm not the only geek out there with an interest towards SQL Server internals.. Turns out there's a pretty good community of us actually.. :)

Anyway, check this site out: http://www.sqlinternalsviewer.com/ — basically it's a tool that lets you see exactly how the SQL Server storage engine is actually storing your data.. Way cool!

Friday, August 31, 2007

Visual Studio Rebranding

Now that Visual Studio 2008 (codename "Orcas") is getting close, Microsoft has decided to rebrand the suite to make it less confusing. Rob Caron has posted details on his blog here: http://blogs.msdn.com/robcaron/archive/2007/07/26/4069573.aspx

Monday, August 27, 2007

SQL Server 2005 Cumulative Update 3

In keeping with their plan to release regular hotfix rollups for SQL Server 2005, the SP2 Hotfix Rollup 3 package has been released. See the article here:

http://support.microsoft.com/kb/939537

Tuesday, July 31, 2007

Team Foundation Server Administration

Codeplex has a very nice administration tool for VS Team Foundation Server. It really does take the work out of managing permissions in the several locations required when you work with TFS. Check it out here: http://www.codeplex.com/Wiki/View.aspx?ProjectName=TFSAdmin

Monday, July 30, 2007

Visual Studio Team System - Process Guidance

As anyone who's used VSTS and tried to customize the process templates knows, it is not an easy task. Some of the VSTS team have been trying to help us with this and the result is now posted on Codeplex. Check out the Patterns and Practices VSTS Process Guidance Generator on Codeplex here: http://www.codeplex.com/process/Release/ProjectReleases.aspx?ReleaseId=5626

Sunday, July 29, 2007

Geek Speak on MSDN

I will be the guest speaker on the Microsoft Developer Network (MSDN) geekSpeak series on September 12!

I'll be speaking on the integration of SQL Server Reporting Services and SharePoint 2007.

Check it out!

Friday, July 27, 2007

New Article on Serverside.net

I was recently interviewed by George Lawton from theserverside.net. Read the article here:

http://blogs.msdn.com/briankel/archive/2007/07/26/dbpro-success-story-on-theserverside-net.aspx

Tuesday, July 3, 2007

Finally saw the book!

Back towards the beginning of the year, I was asked to help out on a book project with a couple of people I know and respect. The book is a training kit for the new MCITP 70-442 exam from Microsoft. I had never written a training kit before, and I have to say that working with MSPress and GrandMasters was one of the best writing experiences I have ever had. I will write for them anytime!

Anyway, the book title is "Designing and Optimizing Data Access by using Microsoft SQL Server 2005 Self-Paced Training Kit" and the ISBN is 978-0-7356-2383-5

Thursday, June 14, 2007

eScrum released!

Microsoft has finally released their heretofore "internal only" Scrum management tool known as eScrum. Basically it's a series of templates and guidance for Team Foundation Server (TFS) that allows project teams to keep track of project tasks using the Scrum methodology.

It's good to see that Microsoft is embracing agile methodologies, and it will be especially nice when the tools (Project/VSTS/TFS/etc) catch up with the vision...

Read more about eScrum here.

See the blog announcement here.

Download the templates here.

Wednesday, June 13, 2007

CA Erwin and DataDude

Seems like there's a lot of news in the industry that I have been missing lately!!! I just noticed that CA has announced that Erwin Modeler 7.2 is now capable of connecting directly to DataDude (Visual Studio Team System for Database Developers) projects.. This is GREAT NEWS for those who have a significant investment in Erwin. (I won't name names, but I know a developer who is very hardheaded about his particular toolset, so it is great to see others embracing the DataDude technology)

Check it out here: http://ca.com/us/content/campaign.aspx?cid=144449

How did I miss this?

Seems like Microsoft has purchased the intellectual property behind SoftArtisans Officewriter product. If you've never used Officewriter, you may wonder what the big deal is, however if you've used it, you know that it really does make authoring SSRS reports easier. Now it would appear Microsoft is going to bring that technology in house and fold it into a future release of SSRS.

This will be really big news for the "self service" business intelligence community!!

Full details are found here: http://www.microsoft.com/presspass/press/2007/may07/05-09BINewDayPR.mspx

Tuesday, June 12, 2007

New Visual Studio Team System Connector!

The Microsoft Visual Studio Team System (VSTS) and Project Server (PS) teams have collaborated on a new connector between VSTS and PS that allows bidirectional sharing of data. This is very useful for those who are using Project to manage VSTS development.

Read about it here: http://blogs.msdn.com/project/archive/2007/06/11/new-connector-released.aspx

Monday, June 11, 2007

Integrating SSRS with SharePoint whitepaper

Microsoft has published a WhitePaper on the integration of SSRS and SharePoint in SQL Server 2005 SP2. (There are a few other things included in this whitepaper as well)

Check it out here.

Saturday, June 9, 2007

TechEd 2007 is finished!

After 2 weeks of some pretty hard work, TechEd 2007 has finally come to an end for me and I'm heading home to Colorado. It was a great time, and more importantly, a ground-breaking experience in the Hands On Labs area. We launched more than 18,000 labs (HOLY COW!) and had some amazing customer satisfaction scores. (I don't think I'm allowed to post them here, but suffice it to say that they were amazing!)

It was an honor to be part of the show, and I truly hope that next year brings more of the same in terms of the HOL area, although I really don't see how this year can be topped.

Wednesday, June 6, 2007

Hands on Labs - SOA13 - Windows Workflow

One of the more popular labs here at TechEd 2007 in the SOA (Service Oriented Architecture) track is SOA13, Microsoft Windows Workflow Foundation Introductory Lab. This lab assumes that the attendee has no knowledge of Windows Workflow, but immediately leads them down the path to creating amazing and useful workflow extensions.

If you're here at TechEd, stop by the HOL North and give it a spin, you will not be disappointed!

WOW!!! - Hands on Labs at TechEd 2007

Here at TechEd 2007 in Orlando, I am totally amazed by some of the statistics here in the Hands On Labs area. It's Wednesday afternoon, and we've already had 13,000 labs launched!! Think about that for a minute, that's 13 THOUSAND Virtual Machines (actually it's more than that because some labs have more than 1 VM) started and interacted with in a 3 day period!!!!

Given that our overall customer satisfaction scores seem to be very high at the moment, I'd say that is an amazing accomplishment!

Tuesday, June 5, 2007

Hands on Lab - BIN02 - SSIS

Continuing the saga of interesting labs here at TechEd, check out BIN02 - Executing Common ETL tasks with Microsoft SQL Server 2005 Integration Services.

This lab is a very good lab that walks the user through a common ETL (Extract, Transform and Load) scenario using SQL Server 2005 Integration Services (SSIS). This lab is very well done, and is also very popular.

If you're here at TechEd, stop by the north HOL and check out BIN02.

Monday, June 4, 2007

Hands on Lab DEV20 - LINQ

Here at TechEd, one of the more interesting labs in the DEV area (HOL North) is DEV20, Mapping objects to Database Tables with LINQ to SQL. This is an amazing lab that walks through some of the new features of LINQ that will be released with the .NET 3.5 Framework.

LINQ, which stands for Language INtegrated Query, is a new extension that essentially removes (that may be a bit too strong) the problems associated with the impedance mismatch between relational database storage/access and object-oriented programming.

More information can be found about LINQ here.

If you are here at TechEd, stop by the HOL North and take a look at DEV20, if you are a database geek you will not be disappointed!

Sunday, June 3, 2007

Internet Access and the Rosen Plaza


In two words, IT SUCKS!


Fortunately for me, I have an integrated Verizon Wireless broadband modem in my laptop which does a great job staying connected... Since I have been doing so much work for the pre-show logistics here at TechEd, I haven't really needed my laptop, but I did just look at the statistics for the modem and noticed that I had been continuously connected for more than 110 hours. (Thank GOD for unlimited data plans!)

Thursday, May 31, 2007

HOL SOA21 - WCF Contracts

As mentioned earlier, I am in the process of QA testing some of the Hands On Labs for TechEd 2007. I am currently working on SOA21, which is a very good Windows Communication Foundation (WCF) lab. Unfortunately, it expects the attendee to have a good understanding of .NET 3.0 concepts as well as a good grasp of C#, so it may be somewhat tough for people to get through.

In Exercise 2, Task 1, you are asked to rebuild the IMath implementation, and you need to ensure that you use the new data contract that was constructed. The complete syntax for this is:

public class MathService : MathTypes.IMath
{
    public MathTypes.MathResponse Add(MathTypes.MathRequest req)
    {
        return new MathTypes.MathResponse(req.x + req.y);
    }
}
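For reference, the MathTypes contract types that the snippet above depends on look roughly like the following. This is a sketch reconstructed from how the snippet uses them (the member names x, y and a response constructor taking the sum); the lab manual has the authoritative versions, and the int member types are an assumption:

```csharp
using System;
using System.Runtime.Serialization;

namespace MathTypes
{
    [DataContract]
    public class MathRequest
    {
        // Member types assumed to be int; the lab defines the actual types.
        [DataMember] public int x;
        [DataMember] public int y;
    }

    [DataContract]
    public class MathResponse
    {
        [DataMember] public int result;

        // The service snippet constructs the response directly from the sum.
        public MathResponse(int result)
        {
            this.result = result;
        }
    }

    // In the lab this interface carries [ServiceContract]/[OperationContract]
    // attributes from System.ServiceModel; they are omitted here for brevity.
    public interface IMath
    {
        MathResponse Add(MathRequest req);
    }
}
```

With types like these in place, the MathService class shown above compiles, and Add simply wraps the sum in a response message.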

HOL SOA19 - Workflow Enabled Services - Sample Code

I am spending this week doing some QA on the Hands on Labs for TechEd, and I will be posting additional info about the labs as I go through them. One thing I have noticed is that every once in a while the sample code is actually missing from a lab. For example, SOA19 has you create a rather lengthy helper class. Here is the missing sample code:

public static string NumToStr(decimal inputValue)
{
    string[] magnitudeNames = { "", "Thousand", "Million" };
    string returnValue = string.Empty;

    inputValue = Math.Abs(inputValue);

    // The magnitude table only covers values up into the millions,
    // so reject anything of a billion or more.
    if (inputValue >= 1000000000m)
    {
        throw new ArgumentOutOfRangeException(
            "inputValue", "The input value is too large.");
    }

    long dollars = System.Convert.ToInt64(Math.Floor(inputValue));
    int cents = System.Convert.ToInt32((inputValue - dollars) * 100);

    int groupIndex = 0;
    while (dollars > 0)
    {
        // Peel off the low-order group of three digits.
        int tempDollars = System.Convert.ToInt32(dollars % 1000);
        dollars /= 1000;

        if (tempDollars != 0)
        {
            returnValue = String.Format("{0} {1} {2}",
                HandleGroup(tempDollars),
                magnitudeNames[groupIndex],
                returnValue);
        }
        groupIndex++;
    }

    if (returnValue.Length == 0)
    {
        // Amounts under a dollar still get a words portion.
        returnValue = "Zero";
    }

    return string.Format(
        "{0} and {1}/100", returnValue.TrimEnd(), cents);
}

private static string HandleGroup(int valueToConvert)
{
    string[] onesNames =
        { "", "One", "Two", "Three", "Four", "Five",
          "Six", "Seven", "Eight", "Nine", "Ten",
          "Eleven", "Twelve", "Thirteen",
          "Fourteen", "Fifteen", "Sixteen", "Seventeen",
          "Eighteen", "Nineteen", "Twenty" };

    string[] tensNames = { "", "", "Twenty",
        "Thirty", "Forty", "Fifty", "Sixty",
        "Seventy", "Eighty", "Ninety" };
    string result = string.Empty;

    // Hundreds digit.
    int digit = valueToConvert / 100;
    valueToConvert %= 100;
    if (digit > 0)
    {
        result = onesNames[digit] + " Hundred";
    }

    int selectVal = valueToConvert;
    if ((1 <= selectVal) && (selectVal <= 20))
    {
        // Trim() covers the case where there is no hundreds part.
        result = string.Format("{0} {1}",
            result, onesNames[valueToConvert]).Trim();
    }
    else if ((21 <= selectVal) && (selectVal <= 99))
    {
        digit = valueToConvert / 10;
        valueToConvert %= 10;

        if (digit > 0)
        {
            result = string.Format("{0} {1}",
                result, tensNames[digit]).Trim();
        }

        if (valueToConvert > 0)
        {
            result = string.Format("{0}-{1}",
                result, onesNames[valueToConvert]);
        }
    }
    return result;
}

Saturday, May 26, 2007

Off to TechEd 2007!

Yes, the title and date are correct. I am off to TechEd a week early because I am privileged to work the show as a Technical Learning Guide (TLG) lead.

If you're going to be at TechEd, stop by the DEV pod in the Hands on Labs (HOL) area and say hello.

Monday, May 21, 2007

Analysis Services 2005 Performance Tuning

Microsoft has published an Analysis Services 2005 update to the very well done Analysis Services 2000 performance tuning guide. It walks you through the things you need to consider when working with large-scale AS applications. The document is 90 pages, but well worth the read. Download it here.

Thursday, May 17, 2007

MDX Script Performance Analyzer

Chris Webb has recently posted a tool on Codeplex that should really help ANYONE who is trying to performance tune MDX queries. (Holy COW is that me right now!). The MDX Script Performance Analyzer can be found here.

In a nutshell, the tool tells you which calculations in your MDX script are the most expensive.

This is a great tool!

Wednesday, May 16, 2007

SharePoint 2007 Performance

I have recently spent a lot of time working with Microsoft Office SharePoint Server (MOSS) 2007 and Excel Services. I can honestly say that MOSS is much easier to work with and much more feature-rich than its predecessor, but there is certainly a lot to think about with respect to capacity planning.

In my current project I am working exclusively with the Business Intelligence aspects of SharePoint, relying on SharePoint's capability to deploy dashboards and KPI lists that use a combination of Excel workbooks and SSRS reports connected to SSAS cubes. It turns out that Excel Services requires a fair amount of processing power, which, coupled with the base requirements for MOSS, means we require significant hardware even for a small number of users. The following diagram represents the logical layout of our single-server deployment:



According to Microsoft, this configuration is not recommended for a production system. I can certainly see why, given the performance requirements.

One thing that we have run across on a regular basis is that MOSS tends to fragment memory badly. Even on systems with a fair amount of available memory, SharePoint and Excel Services need contiguous blocks of memory, which can quickly become constrained when Excel Services is accessed by more than one user. To alleviate this problem, the HeapDeCommitFreeBlockThreshold registry setting (HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager) needs to be changed to 0x00040000, which raises the threshold at which the heap manager decommits free blocks back to the operating system and thereby reduces fragmentation.
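The registry change above can be scripted with a .reg file like the following sketch (apply with caution, this is a machine-wide Session Manager setting, so test on a non-production box first):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager]
"HeapDeCommitFreeBlockThreshold"=dword:00040000
```

A reboot is the safest way to be sure all processes pick up the new threshold.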

Fortunately for us, Microsoft has realized that MOSS performance and scalability is a concern for many organizations and has begun to publish very good information on MOSS deployment. I just wish that Microsoft Learning would realize the need to publish some good MOSS deployment courses.

Take a look at the TechNet article on SharePoint capacity planning.

Tuesday, May 15, 2007

Microsoft BI Website

In case you've missed it, Microsoft now has a new "Business Intelligence" portal. Check out: http://www.microsoft.com/bi/

Shift Happens

Lynn Langit has posted an interesting presentation over on her blog. Check it out here.

Monday, May 14, 2007

Unit Testing Whitepaper published

Sachin Rekhi (Microsoft Team System for Database Professionals team) recently posted a very well done whitepaper on Database Unit testing using Team System for Database Professionals. Check it out here.

Data Dude Service Release

Microsoft has recently released Service Release 1 for Visual Studio Team System for Database Professionals. Read about it in KB 936202.

Gert Drapers has posted some information on exactly what it contains in his blog entry.

Wednesday, May 9, 2007

SQL Server 2005 Performance Reports

In its never-ending quest to make the life of a SQL DBA easier (please, don't laugh, it's really true!), Microsoft has recently released a series of custom reports that plug into SQL Server Management Studio (SSMS) and help track down performance-related issues with SQL Server 2005 (you can download and read about them here). I recently had the opportunity to work with these reports first-hand in a production environment, and while I am normally very skeptical of such things, I have to admit that this toolset really impressed me.

Installing the Performance Reports

The installation of the Performance Reports package is relatively straightforward, although you do have to perform a couple of steps:

  1. Download the SQLServer2005_PerformanceDashboard.msi file and execute it. This will unpack a set of report RDL files and SQL scripts into your SQL Server tools folder (the default is C:\Program Files\Microsoft SQL Server\90\Tools\PerformanceDashboard).
  2. Execute the setup.sql file on each server instance that you plan to monitor with these reports. (Note: You do not need the RDL files on each server)
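Step 2 can be scripted; assuming the default tools path and Windows authentication, a command along these lines (the server and instance names are placeholders) runs the setup script against an instance:

```
sqlcmd -S MYSERVER\MYINSTANCE -E -i "C:\Program Files\Microsoft SQL Server\90\Tools\PerformanceDashboard\setup.sql"
```

Repeat it once per instance you plan to monitor.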

Using the Performance Reports

Once the reports are loaded, you access them by right-clicking on a server in Object Explorer inside SSMS and selecting Reports/Custom Reports. Then browse to the installation folder and select "Performance_Dashboard_Main.rdl". (Note: You may not see any files when you browse, because Management Studio may be filtering for *.rdlc files instead of *.rdl; to solve this, simply type *.rdl in the filename box.) This will open the main dashboard as shown below:



Each of the items in the dashboard is capable of drilling down into more detail. For example, in my case, when I selected the "Current Waiting Requests" graph, I was presented with the following:


Notice that the report shows the specific query that is causing the wait in this case. Clicking on the query brings up the execution plan as follows:



Note that the report shows that there are identified missing indexes that could help the performance of the query. Clicking on the "View Details" link brings up the following report:


This report can now be used to create the missing index that should help the performance of that particular query. (In reality you will want to perform a more complete index analysis using the Database Engine Tuning Advisor and a captured workload, but this is a good start!)

Now, I know not everyone is the SQL Geek that I am, but come on, this stuff is COOL!

Thursday, May 3, 2007

SQL Server 2005 Incremental Servicing Model

Microsoft has recently announced a new methodology for releasing hotfixes for SQL Server, called the "Incremental Servicing Model". In practice, this means that hotfixes for SQL Server will now be delivered on a regular schedule instead of as priority requires.

Under this new methodology, SQL Server "cumulative update" patches will be released every 2 months.

Read more about this here: http://support.microsoft.com/kb/935897

Wednesday, May 2, 2007

SharePoint and Forms Authentication

I have recently needed to get SharePoint 2007 (actually Windows SharePoint Services, WSS 3.0) working with ASP.NET Forms authentication. For something that turns out to be simple to configure, I had a heck of a time locating viable information through MSDN and the blogosphere.

So, with that in mind, I thought I would create a very simple step-by-step guide to help those who find themselves in the same boat (although I know nobody reads this blog anyway).

Configuring SharePoint 2007 / WSS 3.0 to use Forms Auth

One disclaimer that I'll give here is that I am going to give instructions for the simplest method of configuring MOSS/WSS to use forms auth. The end-result is a global configuration and it may not be the best solution for your particular environment, nor is it a particularly smart/secure idea. With that said, you can always expand upon the ideas presented here to customize the solution for your environment.

Step by step instructions are as follows:

  1. Install and configure MOSS/WSS using whatever configuration you deem fit. (In my case, I needed a bare-bones default installation of WSS, but have also tested these steps with SharePoint 2007 Enterprise Edition)
  2. From the Windows\Microsoft.Net\Framework\v2.0.50727 folder, execute aspnet_regsql
    1. Choose Configure SQL Server for Application Services
    2. Use the default database (which will create a database called aspnetdb on whatever instance you choose in the wizard)
    3. Once the wizard is complete, use SQL Server Management Studio to grant access to the user that will be the security principal for the IIS Application Pool that WSS/MOSS will use. (By default it will be NT AUTHORITY\NETWORK SERVICE)
  3. Open the machine.config file from Windows\Microsoft.Net\Framework\v2.0.50727\CONFIG
    1. Locate the <connectionStrings> element
    2. Replace the connectionString attribute for "LocalSqlServer" with an appropriate string that points to the database you created in step 2
  4. In SharePoint Central Administration, create a new web application
    1. Use the default NTLM authentication
    2. Once done, ensure you restart IIS (use IISRESET /restart from a command prompt)
  5. Create a new Site Collection using the web application you created in step 4
    1. Ensure you assign a Windows account as the site administrator (You should test the site before changing authentication types, so you'll need an account that can access the site)
  6. Ensure the new site works by browsing to it
  7. Open SharePoint Central Administration Application Management
    1. Select Authentication Providers and ensure you select the correct web application (the one you created in step 4)
    2. Set the Authentication Type to "Forms"
    3. Set the Membership Provider to "AspNetSqlMembershipProvider" (It is imperative that you spell this correctly – you can cut/paste from machine.config <membership><providers> element if necessary)
    4. Once you save the configuration, restart IIS
  8. Test the new authentication type
    1. Open the site in the browser. If all is working correctly, you will be presented with SharePoint's default ASP.NET login screen
    2. Try to login with any user/password combination. It should fail and return you to the login screen
  9. Add users to the aspnetdb database
    1. The easiest way to do this is through Visual Studio's ASP.NET web configuration utility
      1. Create a new ASP.NET website project
      2. Don't change anything and build the project
      3. From the Website menu, choose "ASP.NET Configuration"
      4. Once the tool loads, choose "From The Internet" in the authentication column
      5. Add users
  10. Test the site again
    1. Choose a valid username/password combination
    2. You should be able to login, but not access the site (SharePoint will tell you that you don't have access)
  11. Open SharePoint Central Administration Application Management
    1. In the SharePoint Site Management section, add a user as a Primary Site Administrator (Choose a user you added in step 9)
    2. In the Application Security section, add any users to the Site Policy as necessary
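For step 3, the edited "LocalSqlServer" entry in machine.config ends up looking something like the following; the server name and security settings are placeholders for your environment:

```xml
<connectionStrings>
  <add name="LocalSqlServer"
       connectionString="Data Source=MYSERVER;Initial Catalog=aspnetdb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The name "LocalSqlServer" must stay exactly as-is, because the AspNetSqlMembershipProvider entry referenced in step 7 points at that connection string by name.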

Once these steps are followed, you should be able to enjoy WSS/MOSS with forms authentication.


I hope these steps have proven useful.