The Odyssey of Sitecore Commerce Staging’s Reliance on SQL CE 3.5

The Problem

I recently completed a fun journey triaging a set of self-inflicted wounds around Sitecore Commerce Staging.  The rest of Sitecore Commerce 8.2.1 ran as expected, except when I set up Sitecore Commerce Staging between a CM and CD tier . . . I would get an odd “The system cannot find the file specified” dialog box immediately after trying to start a basic Staging replication project:

cannot find the File

And when I say that pop-up appears immediately after clicking the “Start Replication” button, I mean it — we’re talking almost instantaneously.

Background

I’ll pause here just to lay out why one cares about Commerce Server Staging with Sitecore.  The Commerce Staging documentation is on MSDN in a variety of places, and Sitecore uses this sub-system to manage Commerce-specific content promotion (such as product catalog changes, promotion codes, etc.).  As explained in the summary on CommerceSDN.Sitecore.net:

The role of Commerce Server Staging is to move Commerce Server data between environments or sites.

If the master and web environments are pointing to the same Commerce Server site instance, as soon as data is changed on the master environment, it will be published to web. To ensure that Commerce Server data is not published unexpectedly, it is strongly recommended that you have one Commerce Server site for your Content Management (CM) environment, a separate Commerce Server site for your Content Delivery (CD) environment, and then use Commerce Server [Staging] to move the data.

I added the [Staging] to the final sentence above, as I think it’s pretty key to understanding the topic.

Commerce Server Staging is the vehicle one uses to manage Commerce data the same way we manage marketing data in the rest of Sitecore.  We create and update Sitecore CMS content in a “master” SQL Server database and then promote it to “web” with a Sitecore Publish operation . . . similarly, we create and update Sitecore Commerce content via the Sitecore CM and promote changes to the live website via Commerce Staging operations.  The Sitecore Publish process has been extended with Sitecore Commerce to include Staging, so you can run it as a single integrated process.  Note the “Commerce Server Staging” checkbox included in the Sitecore Publish dialog below:

StagingPublishDialog

It’s usually magic and just works — but I was not able to run any type of Commerce Staging operation, so I needed to peel back the curtain and learn a bit more about how to troubleshoot Commerce Staging.

The Solution

This write-up on Monitoring Commerce Staging laid the foundation for my eventual resolution.  Before exploring that angle, however, I reviewed this write-up with some basic pitfalls on Commerce Staging . . . of course there’s this gem about DCOM permissions (& my write-up on applying that setting when the GUID isn’t readily available) . . . Sitecore’s KB site and Community.Sitecore.net each have some suggestions as well.  There are hundreds of nooks and crannies one can investigate here, so it took me some time to discover what was going on.  Hopefully, my adventure can inform others, so here goes . . .

  1. The link about monitoring Commerce Staging (https://msdn.microsoft.com/en-us/library/ms961837(v=cs.70).aspx) talks about ways to configure what is logged and how to view the events of the Staging process.  I digested this info and, while the write-up assumes some old IIS configurations, I discovered the C:\Program Files (x86)\Commerce Server 11\Staging directory has two folders of particular interest to me: Data and Events.  The Events folder has .mdb and .ldb files (yes, that’s Microsoft Access) that form a type of localized Event Log for Commerce Staging; the Data folder has a StagingLog.sdf file — .sdf is a format used by SQL Server Compact Edition.
  2. I opened the Access database and found that it didn’t have any helpful information for my scenario.  My “cannot find the file specified” exception was not in evidence there . . . but there is some interesting meta-data captured in that Access MDB for Staging.  This Events folder appeared to be a dead-end.
  3. I used LINQPad to analyze the StagingLog.sdf file (it turns out Commerce Staging uses SQL Server Compact Edition 3.5), and I learned two key things:
    • This StagingLog.sdf file was essentially empty — no data was being written to it when I tried to start a Commerce Staging operation
    • I couldn’t use LINQPad on the server running Commerce Staging as I didn’t have the proper SQL CE 3.5 drivers, but when I copied the .sdf file to my developer machine I could view it without issue.  Interesting . . .
  4. My next step was to use ProcMon (available at https://docs.microsoft.com/en-us/sysinternals/downloads/procmon) and compare a process capture from my environment that raised the exception with a capture from a properly functioning Commerce Staging environment.  This took some trial and error, but filtering on the CSSsrv.exe process helped me focus on the task at hand.
    • The properly functioning Staging environment had a lot of activity with the StagingLog.sdf file.  We’re talking hundreds of operations.  The broken Staging environment had none.  Once I lined up the ProcMon captures 1:1 between the two environments, I found the final piece of evidence I needed:

SqlCE

The highlighted entries above, referencing C:\Program Files (x86)\Microsoft SQL Server Compact Edition\v3.5\sqlceer35EN.dll, were entirely missing from the initialization process of my broken Commerce Staging environment.

The lightbulb finally went on for me, and I did a quick comparison of the C:\Program Files (x86) directories of the two servers.  Key components were missing from the environment where Staging wasn’t working, and they all revolved around SQL Server Compact Edition.  Here’s a highlight showing what obviously wasn’t on the server in question:

dirs

Neither MS SQL Server Compact Edition nor MS Synchronization Services was installed on this server.  I did some further research and learned that CommerceServer.exe should fold these elements into any Commerce install, so I ran a repair installation with CommerceServer.exe; that added SQL Server CE and the other dependencies necessary to solve the issue.
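
For what it’s worth, a quick presence check would have surfaced this much sooner.  Here’s a minimal PowerShell sketch based on the default paths from my environment; adjust them to match your servers:

# Minimal sketch: check for the Staging dependencies that were missing in my case.
# Paths are the install defaults from my environment; adjust for yours.
$dependencies = @(
    'C:\Program Files (x86)\Microsoft SQL Server Compact Edition\v3.5\sqlceer35EN.dll',
    'C:\Program Files (x86)\Microsoft Synchronization Services',
    'C:\Program Files (x86)\Commerce Server 11\Staging\Data\StagingLog.sdf'
)

foreach ($path in $dependencies) {
    $status = if (Test-Path -Path $path) { 'OK     ' } else { 'MISSING' }
    Write-Output "$status $path"
}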

Our PowerShell that installs CommerceServer.exe must not have been using a properly elevated account, or the original install was a manual execution of the installer without the proper administrative credentials.  Everything worked fine after I completed the repair install of Commerce Server.

There are some additional items to point out here:

  • Based on my testing, the CommerceServer.exe may exclude some key Staging dependencies if it’s not run in a proper Administrator context.  I thought this was noted somewhere in Sitecore’s documentation on installing the Commerce Server .exe, but I don’t see it there; running that EXE in the context of a local admin account is crucial to the success of that process.  I use a local admin account and the shift + right-click -> “Run as different user” option when launching CommerceServer.exe manually, and this has been successful for me.
  • If you’re like me and love the postscript to a good story, here’s what LINQPad shows for that StagingLog.sdf once you’ve run a couple of successful Staging operations:

Linqpad

  • Additional takeaways from this write-up:
    • there’s an MS Access database in C:\Program Files (x86)\Commerce Server 11\Staging\Events that could be a source of interesting diagnostic info on your Sitecore Commerce Staging activities
    • there’s a SQL CE database in C:\Program Files (x86)\Commerce Server 11\Staging\Data that could serve the same diagnostic purpose (a quick sketch for peeking into it follows below)
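
Here’s the quick sketch I mentioned for peeking into that StagingLog.sdf directly on the server.  It assumes the SQL CE 3.5 managed provider sits at the default install path below, and since that runtime is x86 you may need to run it from a 32-bit PowerShell console:

# Minimal sketch: list the tables inside the Commerce Staging SQL CE log database.
# The provider path is the assumed default for SQL CE 3.5; adjust if your install differs.
$providerPath = 'C:\Program Files (x86)\Microsoft SQL Server Compact Edition\v3.5\Desktop\System.Data.SqlServerCe.dll'
$sdfPath      = 'C:\Program Files (x86)\Commerce Server 11\Staging\Data\StagingLog.sdf'

Add-Type -Path $providerPath

$connection = New-Object -TypeName System.Data.SqlServerCe.SqlCeConnection -ArgumentList "Data Source=$sdfPath"
$connection.Open()
try {
    # SQL CE supports INFORMATION_SCHEMA.TABLES, which is enough for a quick inventory
    $command = $connection.CreateCommand()
    $command.CommandText = 'SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES'
    $reader = $command.ExecuteReader()
    while ($reader.Read()) {
        $reader.GetString(0)
    }
    $reader.Close()
}
finally {
    $connection.Close()
}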

How a 13-year-old archived listserv helped me out with Sitecore Commerce

Sitecore Commerce is an interesting landscape — it’s never a dull moment.  After recently swapping the backing store from Azure SQL to SQL Server (due to an interesting Inventory gotcha with the Reference Storefront that I’ll maybe share at some other time), I’m finding nooks and crannies of configuration I never knew existed with the Commerce product until now.

After I migrated to Azure IaaS SQL Server VMs from Azure SQL, I thought I had everything tidied up.

  • Commerce Server Manager references?  ✔
  • Sitecore application connection strings?  ✔
  • Bootstrap configuration (I posted this gist on manipulating those files to make this easier)?  ✔

I updated the Azure SQL database credentials to prove that I had no lingering connections to Azure SQL.  I encountered an exception at Sitecore start-up related to initialization of the profile service, however, and had to start digging.  CommerceProfileSystemException was the exception type and the stacktrace started as follows:

Exception type: CommerceProfileSystemException 
 Exception message: Failed to initialize profile service handle.
 at CommerceServer.Core.Runtime.Profiles.ProfileContext..ctor(String profileServiceConnectionString, String providerConnectionString, String bdaoConnectionString, DebugContext debugContext)
 at CommerceServer.Core.Runtime.CommerceContextFactory.CreateProfileContext()
 at CommerceServer.Core.Runtime.CommerceContextFactory.get_ProfileContextSingleton()
 at CommerceServer.Core.Runtime.Profiles.CommerceProfileModule.get_ModuleProfileContext()
 at CommerceServer.Core.Runtime.Profiles.CommerceProfileModule.get_ProfileContext()

The Commerce Server Manager encapsulates the connection strings, and I thought I had them all updated to the SQL Server VM equivalents, even going so far as to inspect MSCS_Admin in SQL Server with a query like this:

SELECT [i_ResourceID]
 ,[s_PropertyName]
 ,[s_Value]
 FROM [MSCS_Admin].[dbo].[ResourceProps]
 where f_IsConnStr=1
 ORDER BY 1

While it was interesting to find where this information is stored (it may or may not be a permanent location; tough to tell with Commerce!), this output didn’t shed light on what might be going on.

Eventually I stumbled across some 13-year-old documentation on Commerce Server discussing updating the ProfileService data source in some detail (http://microsoft.public.commerceserver.general.narkive.com/NPLMLusv/commerce-2002sp3-on-windows-2003-can-t-change-profiles-data-source).  It turns out this 13-year-old solution was completely applicable to my 2017 Sitecore Commerce predicament.

Succinctly, within Commerce Server Manager you should do the following:

  1. Expand the Commerce Server “Global Resources” node, then “Profiles” node, then “Profile Catalog” node, then “Data Sources” node, and finally expand the “ProfileService_SQLSource” node
  2. Click on the Partitions node.
  3. In the right pane, there’s a SQLSource element you right-click and choose “Properties”.
  4. Select the Partitions Tab, then “Edit” the connection string
  5. Make your connection string modifications here.  This is where my elusive reference to Azure SQL was hiding and causing Sitecore to fail to initialize.

The more work I do with Sitecore Commerce, the more I’m appreciating the value of the older documentation targeting previous editions of the product.  The catch is, it’s not 100% relevant to the modern experience with Sitecore Commerce . . . and knowing what is and isn’t applicable to the Sitecore Commerce 8.2.1 world is a challenge.  I think we’re getting there, a little bit at a time!

A note on the Sitecore Commerce DCOM Config Permissions

I’ve been having lots of fun with multiple Sitecore Commerce projects and Azure SQL lately . . . here’s one quick note I can share that might save some hassle for others working in this space . . .

This Sitecore link mentions a DCOM permission to verify in the course of troubleshooting some Sitecore Commerce “Staging” issues.  My problem was that this GUID, 7E95698D-CD3C-4C98-93C7-6510C31F7DDF, wasn’t visible in the Component Services treeview under “DCOM Config.”

Sitecore support informed me that they needed to update their documentation to mention that, in the absence of that GUID, one should locate the “CSS Replication Server” object in the treeview and proceed from there.

My screenshot shows where the elusive GUID is stored as a property of the CSS Replication Server entry: DCOM
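
If you’d rather not hunt through the Component Services treeview, a quick registry check can confirm which DCOM application owns that GUID.  A minimal sketch (the registry location is the standard one for DCOM AppIDs; the expected display name comes from my environment):

# Minimal sketch: look up the friendly name behind the DCOM AppID from the KB article.
# DCOM applications register under HKLM:\SOFTWARE\Classes\AppID; the key's default value
# holds the display name shown in Component Services.
$appId = '{7E95698D-CD3C-4C98-93C7-6510C31F7DDF}'
$key   = "HKLM:\SOFTWARE\Classes\AppID\$appId"

if (Test-Path -Path $key) {
    # Expect something like "CSS Replication Server" on a Commerce Staging box
    (Get-ItemProperty -Path $key).'(default)'
}
else {
    Write-Output "AppID $appId is not registered on this server"
}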

I expect Sitecore will update their documentation shortly, so this blog post may have a brief shelf-life in terms of relevancy . . . but if you’re like me and still getting your bearings with the Commerce platform, any notes are appreciated, so I’ll see if this helps anyone in the community.

Encrypting Sitecore connection strings for Sitecore Commerce, Azure SQL, and beyond

There’s been a lot of Sitecore Commerce on my plate this summer, and sadly one limitation of using that product for some customers is the requirement for SQL Server authentication instead of Active Directory and Windows Auth; I won’t get into why they need SQL auth at this point, but trust that in many use cases this is a necessity.

In an effort to always deliver a secured platform for customers, at Rackspace we encrypt the App_Config/connectionStrings.config file to avoid having plaintext usernames and passwords on disk.  This is a link to our Rackspace GitHub “gist” performing such encryption with the ASP.NET tool aspnet_regiis.exe.  The logic is also there to decrypt, in case that’s necessary.

Encryption success
You can update the $configLocation variable at the top of the script to point to where your Sitecore installation is located; you then run the script using PowerShell, and you’ll get output like the “Encryption success” screenshot above.

Once you’ve run the script, your connectionStrings.config file will contain encrypted markup instead of plaintext connection strings.

Before you get too excited: for Sitecore Commerce in its current incarnation, there are several other plaintext passwords on disk in the \CommerceAuthoring\wwwroot\data\Environments folder and related .json files, for both SQL and Sitecore.  The PowerShell I’ve shared doesn’t address those areas.  The Sitecore Commerce documentation does a good job of cataloging where to find these references, at least, but this still leaves a lot to be desired in terms of security (a quick way to inventory them is sketched below).
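
Here’s a minimal sketch for taking that inventory, so you know exactly what the connection string encryption does not cover.  The root path is a placeholder; point it at your Commerce Engine instance(s):

# Minimal sketch: inventory the Commerce Engine JSON files that still hold plaintext
# credentials. The root path is a placeholder for your CommerceAuthoring instance.
$environmentsPath = 'C:\inetpub\CommerceAuthoring\wwwroot\data\Environments'

Get-ChildItem -Path $environmentsPath -Filter *.json -Recurse |
    Select-String -Pattern 'password' -SimpleMatch |
    Select-Object -Property Path, LineNumber, Line |
    Format-Table -AutoSize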

I’m not going to go too far down this path, since I mostly wanted to post the PowerShell we use to automate SQL Server connection string encryption.  This technique can be useful for a variety of projects, not just for Sitecore Commerce — although this is the use case we’re repeatedly seeing right now.  If I have time, I’ll share some other Sitecore Commerce tips around Azure SQL friendly deployments (Sitecore’s documentation is a decent start, but lacking in some respects).

Here’s the script to encrypt/decrypt that Sitecore connectionStrings.config file:
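
The full script is in the gist linked above; as a minimal sketch of the same approach, the core of it boils down to calling aspnet_regiis.exe with -pef (encrypt) or -pdf (decrypt).  The $configLocation value is a placeholder for your site root (the folder containing web.config), and the aspnet_regiis.exe path here assumes 64-bit .NET 4:

# Minimal sketch (not the full gist): encrypt or decrypt the connectionStrings section
# in place with aspnet_regiis.exe. When the section uses configSource, the referenced
# connectionStrings.config file is what gets rewritten.
$configLocation = 'C:\inetpub\wwwroot\sc821\Website'   # placeholder: folder containing web.config
$aspnetRegiis   = "$env:windir\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe"
$decrypt        = $false                               # flip to $true to reverse the operation

$action = if ($decrypt) { '-pdf' } else { '-pef' }

& $aspnetRegiis $action 'connectionStrings' $configLocation
if ($LASTEXITCODE -ne 0) {
    Write-Error "aspnet_regiis exited with code $LASTEXITCODE"
}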

A few Solr thoughts

Solr has never been more pervasive across the Sitecore projects I’m seeing these days.  Deciding which version of Solr to use for a greenfield Sitecore project, however, is not clear-cut.

Easy answer: use Solr 5.1

Sitecore’s KB article on compatibility with Solr serves as our official reference when it comes to selecting a Solr version to standardize on.  At face-value, if you’re using Sitecore version 8.2, you’re steered to Solr version 5.1:

SolrCompat

The diagram has a note [3], however, that is worth a closer look.  The note references this exception:

WARN  Unable to connect to Solr: [http://{hostname}:{port}/solr], the [SolrNet.Exceptions.SolrConnectionException] was caught.
Exception: SolrNet.Exceptions.SolrConnectionException
Message: Error handling 'status' action
org.apache.solr.common.SolrException: Error handling 'status' action
  • The note’s remedy: “To resolve issue, upgrade Solr to 5.5.1 or later version.”

Easy answer: use Solr 5.5.1

I asked Sitecore Support about this, and the guidance I received was to build on Solr version 5.5.1 instead of what the KB article states.  There are no plans to alter the guidance in that KB article, however, since Sitecore 8.2 as a whole platform was thoroughly tested with Solr 5.1.  Apparently, Solr 5.5.1 was not available at the time of that testing.

Anecdotally, Sitecore has found fewer errors when using Solr 5.5.1 instead of Solr 5.1 — when pressed for specifics, it was shared that these two Solr issues have caused problems for other Sitecore implementations:

  1. https://issues.apache.org/jira/browse/SOLR-8793
    • FileNotFoundException or NoSuchFileException with Solr — see comment from Sitecore KB article that it can cause “Unable to connect to Solr” exceptions in some cases
  2. https://issues.apache.org/jira/browse/LUCENE-7188
    • NRTCachingDirectory error where an IllegalStateException exception is thrown

Easy answer: there are no easy answers

I’ve worked with a number of Solr 5.1 projects with Sitecore, and some using other Solr versions prior to Solr 5.5.1, but haven’t encountered the above errors as major impediments.

It’s tempting to use Solr 5.5.1, but if a project is using EXM, WFFM, Sitecore Commerce, or some other combination of technology edge cases, it’s at least theoretically possible that Sitecore Support could fall back on the officially published “Solr 5.1 ✓ officially tested, recommended” guidance from their KB article.  That’s enough for us to approach new Sitecore projects depending on Solr with Solr version 5.1, and to keep an eye out for those particular gotchas that may cause us to upgrade to Solr 5.5.1.
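
Whichever version you standardize on, it’s worth confirming what a given environment is actually running; the Solr admin API reports it.  A minimal sketch, with the hostname and port as placeholders:

# Minimal sketch: ask a running Solr instance which version it is via the admin API.
# Hostname and port are placeholders for your environment.
$solrBase = 'http://localhost:8983/solr'

$info = Invoke-RestMethod -Uri "$solrBase/admin/info/system?wt=json"
$info.lucene.'solr-spec-version'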

The catch is, if you’re upgrading Solr and stopping at Solr 5.5.1, is there a strong rationale not to upgrade beyond 5.5.1?  At this point, http://archive.apache.org/dist/lucene/solr/ has a wealth of newer Solr versions that are bound to have more patches and fixes than 5.5.1.  This is what you call a slippery slope:

solrslippery.JPG

I have to be careful here as I walk the line of a non-disclosure agreement, but there are still more variables to consider: in the near future, a Sitecore release is likely to involve thorough support for a very recent version of Solr.  Expect a Solr version newer than 5.5.1 (which was released in May of 2016 ☺).

So…

I believe I’ve sold myself on the wisdom of Solr 5.1 for now — so long as the sacred Sitecore Support ✓ is present on the official compatibility table.  It’s key to continue learning with Solr, though, and in the months to come we may be talking about SolrCloud and managed Solr schemas . . . cool new aspects to improve Sitecore implementations.

Sitecore Session Persistence Notes

I’ve neglected this blog of late, being focused on a number of “not easily blogged about” scenarios across several Sitecore projects.  It’s too bad, because the work is very interesting, but it doesn’t lend itself to a page-or-two write-up with a digestible takeaway for the general Sitecore community out there.

I do want to keep in the habit of blogging, though, so I’m going to mention this ongoing discussion I’ve been a part of about session management with regard to Sitecore.  There are a few options for managing HTTP session state with Sitecore covered in https://doc.sitecore.net/sitecore_experience_platform/setting_up_and_maintaining/xdb/session_state/session_state: SQL Server, MongoDB, and Redis.  Those three technologies are really just the tip of the iceberg, as the implementation details for each can get quite involved.  For the discerning Sitecore implementation, it can be useful to understand the nuances of each session state provider.  While not an exhaustive look at any one of these solutions, I wanted to post some notes on each one given the current state of Sitecore architecture (June 2017):

SQL Server

This is often the default session provider we gravitate to.  The SQL Server “Boost” script from Sitecore is something we’ve used on implementations (see “Optimize SQL Server performance” on that link), but it is not without its rough edges (see our Rackspace write-up on how to alter permissions so TempDB is reliably available across service restarts).

You’ll notice the approach for improving SQL Server performance with session state is all about getting session state “in-memory” to the furthest extent possible.  Remember this when we examine the other two providers below . . .

I will say that, generally speaking, SQL Server is easy to administer as it’s a well-known technology, and updating it, scaling it, managing fail-overs, etc., is simple compared with the alternatives.  SQL Server has been part of the Windows dev stack for ages now, which is a big reason it’s so often the starting point for session state.

MongoDB

With MongoDB serving as the persistence layer for Sitecore’s xDB, it became a fully supported and viable option for HTTP session state with Sitecore at the same time.  The comparative performance between MongoDB and SQL Server is up for debate (Redis too, for that matter!), and it usually comes down to testing based on how the specific implementation is using session with Sitecore etc; I’m not going to hazard any generalizations on relative perf, as that’s not really the point of this post.

Instead, I’d like to point out how MongoDB does not come in just a single flavor.  The two most common flavors, or “storage engines,” are MMAP and WiredTiger, but there are still others designed to serve specific use cases.  Take, for example, the Percona Server for MongoDB hosted by ObjectRocket that has a posted option for the RocksDB storage engine.  RocksDB with MongoDB may not be a great fit for Sitecore session state (RocksDB is tuned for write-heavy workloads — and, in some cases, if you’re making extensive use of TTL indexes for Sitecore then RocksDB fits those scenarios in certain appealing ways), but it does open the door to MongoDB being more than just a one-size-fits-all data repository (read more about RocksDB and its Facebook pedigree here).  One MongoDB storage engine option that is easily overlooked is WiredTiger “in-memory,” which forces data to be stored in RAM . . . and this is perfect for HTTP session state for most Sitecore builds.

In fact, if you consider the SQL Server “boost” approach that uses TempDB to store session state for Sitecore . . . WiredTiger “in-memory” is attacking the problem from the same direction.  Store everything in RAM!  This is why one must be cautious with general comparisons between SQL Server and MongoDB; the devil is always in the details.  A far better comparison would be “boosted” SQL Server for Sitecore using TempDB vs the MongoDB WiredTiger “in-memory” storage engine.  And note the network latency . . . and the size of the session objects . . . and you’re getting the point, I trust.  To really answer the SQL Server vs MongoDB question for Sitecore sessions, one has to develop a matrix of performance evaluations and level assumptions across the board.  “It depends” is the only honest answer that doesn’t come with a list of caveats.

If you’re curious about this MongoDB topic for your project, go to http://objectrocket.com/docs/mongodb_plans.html and spin up a WT 3.2 storage engine plan with 5 GB of storage (this allows 1.5 GB for RAM).  1.5 GB of RAM is going to be overkill for most small/medium Sitecore implementations — but again, you’ll want to test with your specific session data set to see!  Furthermore, network latency of 10 ms or less is going to help make the most of an ObjectRocket hosted MongoDB service like this — otherwise, the network latency may not make it worth the money.  Let me know if you pursue this with ObjectRocket, as there are some benchmarking measures we want to do but we haven’t had a real implementation to try them out on.  So if you feel like being a guinea pig, please let me know at grant.killian [at] rackspace.com.  It would be great to have real world metrics to prove this all out.
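
Since that latency threshold matters so much, it’s worth a rough measurement before committing to a hosted plan.  Here’s a minimal sketch that times TCP connects to the session store host as a stand-in for round-trip latency (the hostname and port are placeholders; 27017 is the MongoDB default):

# Minimal sketch: time TCP connects to the session store host as a rough proxy for
# network latency. Hostname and port are placeholders (27017 is the MongoDB default).
$sessionHost = 'your-instance.mongo.objectrocket.com'
$port        = 27017
$samples     = 10

$timings = for ($i = 0; $i -lt $samples; $i++) {
    $client    = New-Object -TypeName System.Net.Sockets.TcpClient
    $stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
    $client.Connect($sessionHost, $port)
    $stopwatch.Stop()
    $client.Close()
    $stopwatch.Elapsed.TotalMilliseconds
}

$average = ($timings | Measure-Object -Average).Average
"Average TCP connect time over $samples samples: {0:N1} ms" -f $average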

Redis

If the way to get the best session management performance out of SQL Server and MongoDB is to find in-memory solutions, Redis looks like a slam dunk since it’s an in-memory storage solution to begin with.  We find most clients aren’t interested in managing Redis infrastructure, so again a hosted option such as ObjectRocket has appeal.

Sitecore relies on the StackExchange.Redis assembly, which doesn’t support Redis Sentinel — it’s a bit of a saga at https://github.com/StackExchange/StackExchange.Redis/pull/406; therefore there’s not a great high-availability story with self-hosted Redis and Sitecore right now.  How concerned one should be with HA of fairly transient HTTP session state for Sitecore, however, is an open question.  I usually wouldn’t worry about it too much.  Honestly, Redis is a technology that we’re just now starting to get really serious about at Rackspace, so our sophistication in this space will improve dramatically in the months to come.  Between Azure Redis and all the Sitecore PaaS movement we’re seeing, it’s become a key player in a lot of Sitecore architectures.

xDB Reporting Database Rebuild Help

I’ve created something like this every time I need to rebuild the Sitecore “reporting” database (this link covers the basic process); this time I’m posting it online so I can re-use it next time around!

This is the script for generating the T-SQL that’s required to complete step #3 in the write-up when you’re following the “Rebuild Reporting Database” instructions:

“In the Rebuild Reporting Database page, when you see the Waiting to receive data status, copy the following marketing definition tables from the primary to the secondary reporting database”

I have written the SQL several times to do this, but this time I took a run at DRY (don’t repeat yourself) and scripted the SQL generation.  Alas, my T-SQL comes in at 40+ lines of code versus the raw SQL, which is just 35 lines and much easier to read, in my opinion.

Either way, you can pick whichever you prefer, as I’ll share them both here.

First, the plain vanilla SQL commands for copying those database tables:

INSERT INTO target_Analytics.dbo.CampaignActivityDefinitions
         SELECT source_Analytics.dbo.CampaignActivityDefinitions.*
         FROM  source_Analytics.dbo.CampaignActivityDefinitions ;

INSERT INTO target_Analytics.dbo.GoalDefinitions
         SELECT source_Analytics.dbo.GoalDefinitions.*
         FROM  source_Analytics.dbo.GoalDefinitions ;

INSERT INTO target_Analytics.dbo.OutcomeDefinitions
         SELECT source_Analytics.dbo.OutcomeDefinitions.*
         FROM  source_Analytics.dbo.OutcomeDefinitions ;

INSERT INTO target_Analytics.dbo.MarketingAssetDefinitions
         SELECT source_Analytics.dbo.MarketingAssetDefinitions.*
         FROM  source_Analytics.dbo.MarketingAssetDefinitions ;

INSERT INTO target_Analytics.dbo.Taxonomy_TaxonEntity
         SELECT source_Analytics.dbo.Taxonomy_TaxonEntity.*
         FROM  source_Analytics.dbo.Taxonomy_TaxonEntity ;

INSERT INTO target_Analytics.dbo.Taxonomy_TaxonEntityFieldDefinition
         SELECT source_Analytics.dbo.Taxonomy_TaxonEntityFieldDefinition.*
         FROM  source_Analytics.dbo.Taxonomy_TaxonEntityFieldDefinition ;

INSERT INTO target_Analytics.dbo.Taxonomy_TaxonEntityFieldValue
         SELECT source_Analytics.dbo.Taxonomy_TaxonEntityFieldValue.*
         FROM  source_Analytics.dbo.Taxonomy_TaxonEntityFieldValue ;

And now, here’s the T-SQL attempt to “simplify” the process of creating a script like the above for future projects (though I still prefer the brute-force approach):

The advantage of the script below is that you set the source and target variables to the names of your SQL Server databases, and then you’re all set.

DECLARE @source VARCHAR(100)
DECLARE @target VARCHAR(100)
SET @source = 'source_Analytics'
SET @target = 'target_Analytics'

SET NOCOUNT ON
--Multi-row VALUES list requires SQL Server 2008 or later
DECLARE @ListOfTables TABLE(IDs VARCHAR(100));
INSERT INTO @ListOfTables
VALUES('CampaignActivityDefinitions'),
  ('GoalDefinitions'),
  ('OutcomeDefinitions'),
  ('MarketingAssetDefinitions'),
  ('Taxonomy_TaxonEntity'),
  ('Taxonomy_TaxonEntityFieldDefinition'),
  ('Taxonomy_TaxonEntityFieldValue');

SET ROWCOUNT 0
SELECT NULL mykey, * INTO #mytemp FROM @ListOfTables
DECLARE @theTable varchar(100)
DECLARE @sql varchar(1000)

SET ROWCOUNT 1
UPDATE #mytemp SET mykey = 1

WHILE @@rowcount > 0
BEGIN
    SET ROWCOUNT 0
    SELECT @theTable = (SELECT IDs FROM #mytemp WHERE mykey = 1)
    PRINT 'INSERT INTO ' + @target + '.dbo.' + @theTable + '
         SELECT ' + @source +  '.dbo.' + @theTable + '.*
         FROM  ' + @source + '.dbo.' + @theTable + ' ;'
     --use EXEC to run the dynamic SQL, instead of PRINT,
     --if you're feeling brave

    DELETE #mytemp WHERE mykey = 1
    SET ROWCOUNT 1
    UPDATE #mytemp SET mykey = 1
END
SET ROWCOUNT 0
DROP TABLE #mytemp
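
If you’d rather drive the copy from PowerShell than paste SQL into SSMS, here’s one more sketch that loops the same seven tables with Invoke-Sqlcmd.  The server and database names are placeholders, and it assumes the SqlServer module is available:

# Minimal sketch: run the same seven INSERT...SELECT copies from PowerShell.
# Server and database names are placeholders; requires the SqlServer module (Invoke-Sqlcmd).
$serverInstance = 'YOUR-SQL-SERVER'
$sourceDb       = 'source_Analytics'
$targetDb       = 'target_Analytics'

$tables = @(
    'CampaignActivityDefinitions',
    'GoalDefinitions',
    'OutcomeDefinitions',
    'MarketingAssetDefinitions',
    'Taxonomy_TaxonEntity',
    'Taxonomy_TaxonEntityFieldDefinition',
    'Taxonomy_TaxonEntityFieldValue'
)

foreach ($table in $tables) {
    $sql = "INSERT INTO [$targetDb].dbo.[$table] SELECT * FROM [$sourceDb].dbo.[$table];"
    Write-Output "Copying $table ..."
    Invoke-Sqlcmd -ServerInstance $serverInstance -Query $sql
}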