Sending IDOCs to SAP using SSIS – MSSQL Integration Services

Receiving and sending IDOCs is normally easy with the BizTalk Adapter for mySAP Business Suite, which is included in the Microsoft BizTalk Adapter Pack 2.0 for BizTalk Server. But what if you want to send an IDOC using SSIS and BizTalk is installed on a different server? Below are the steps on how you can do that.

Note: Since SSIS 2008 only supports .NET 3.5 or earlier, all custom assemblies called from it must target that framework.

Prerequisites:

1. WCF LOB Adapter SDK (WCF-LOB-Adapter-SDK-2010-x86.exe) – get it here, or from the BizTalk DVD installer under BizTalk Server\ASDK_X86.

2. Microsoft BizTalk Adapter Pack 2.0 x86 – get the evaluation version here, or from the BizTalk DVD installer under BizTalk Server\AdapterPack_x86.

3. SAP Libraries x86 – these need to be installed in the C:\Windows\SysWOW64 folder. See the SAP adapter section in the Microsoft Adapter Pack 2.0 install guide.

IDOC Library:

The next step is to create an IDOC WCF client that you can call inside SSIS.

1. Open Visual Studio and create a new Class Library project.

2. Right click the Project -> Click Add Adapter Service Reference.

3. Set the binding to sapBinding and configure the URI to:

sap://CLIENT={SAP Client ID};LANG=EN;@A/{Application Server Host}/{System Number}?ListenerGwServ={Gateway Service}&ListenerGwHost={Gateway Host}&ListenerProgramId={Program Id}&RfcSdkTrace=False&AbapDebug=False (see this help topic for the configuration options).

4. Click Configure and set the credentials, then click Connect. Since we are sending IDOCs, the contract type should be set to Client (Outbound operations).

5. In the Category pane, browse for the specific IDOC type and version. When generating the schema you might encounter the error “Error returned by RfcCallReceiveEx while calling RFC: IDOCTYPE_READ_COMPLETE..”. This means you are selecting an incorrect version of the IDOC; ask your SAP resource to identify the correct Release Number.

Select the Send operation and click Add, then click OK. This generates a WCF client that you can call inside SSIS.

6. Sign the assembly, compile it, deploy it to the GAC, and then copy it to C:\Windows\Microsoft.NET\Framework\v2.0.50727.

Copying it to this folder is necessary so you can add a reference to it in SSIS.

SSIS:

1. Add a data flow.

2. Add the source (OLE DB Source, Flat File Source, etc.).

3. Add a Script Component and configure it as a Destination. Double-click it and change the target framework to .NET 3.5 in the project properties.

Add reference to the IDOC Library.

Create the IDOC object and pass it to the IDOC WCF client (a sketch of the script component code appears after the console sample below).

References:

Using Scripting Destination in SSIS

Sample on how to call a WCF client in SSIS

To call the IDOC WCF client, I've copied the code from http://technet.microsoft.com/en-us/library/cc185231(BTS.10).aspx and modified it:


using System;
using System.Collections.Generic;
using System.Text;

// Add WCF, WCF LOB Adapter SDK, and SAP adapter namespaces
using System.ServiceModel;     //Change the Project to target .NET 3.5 and reference System.ServiceModel
using Microsoft.Adapters.SAP; //This assembly is under the Microsoft Adapter Pack 2.0 install folder
using Microsoft.ServiceModel.Channels; //This assembly is under WCF LOB SDK Install folder

// Include this namespace for WCF LOB Adapter SDK and SAP exceptions
using Microsoft.ServiceModel.Channels.Common;

namespace SapTypeIDOCClient
{
    class Program
    {
        static void Main(string[] args)
        {
            // variable for the IDOC client
            IdocClient idocClient = null;

            Console.WriteLine("IDOC XML client sample started");
            try
            {

                // Construct the IDOC object here, e.g. MATMAS01, PHRMAS01, SUBMAS
                var idocObj = new {IDOC OBJECT}();

                // Assign the properties and segments of the IDOC here.
                //idocObj.

                // Variable for the GUID
                System.Nullable<System.Guid> adapterTxGuid;
                // string to hold the SAP transaction ID (TID)
                string sapTxId;

                // The client can be configured from app.config, but it is
                // explicitly configured here for demonstration.
                // set AutoConfirmSentIdocs property to true
                SAPBinding binding = new SAPBinding();
                binding.AutoConfirmSentIdocs = true;

                // Set endpoint address
                EndpointAddress endpointAddress = new EndpointAddress("{SAP Connection String, see the settings above}");

                // Create client and set credentials
                idocClient = new {IDOC WCF Client}(binding, endpointAddress);
                idocClient.ClientCredentials.UserName.UserName = "YourUserName";
                idocClient.ClientCredentials.UserName.Password = "YourPassword";

                // Open the client and send the Idoc
                idocClient.Open();

                // Get a new GUID to pass to Send. You can also assign null
                // to have the adapter generate a GUID.
                adapterTxGuid = Guid.NewGuid();

                // We are using the Send method; it accepts a strongly typed IDOC (XML).
                // SendIdoc sends a flat-file IDOC string instead.
                idocClient.Send(idocObj, ref adapterTxGuid);

                // The AutoConfirmSentIdocs binding property is set to true, so there is no need to
                // confirm the IDOC. If this property is not set to true, you must call the
                // RfcConfirmTransID method of a TrfcClient with adapterTxGuid to
                // confirm the transaction on the SAP system.

                // Get SAP tx id from GUID
                sapTxId = SAPAdapterUtilities.ConvertGuidToTid((Guid) adapterTxGuid);

                Console.WriteLine("IDOC sent");
                Console.WriteLine("The SAP Transaction Id is : " + sapTxId);

            catch (Exception ex)
            {
                Console.WriteLine("Exception is: " + ex.Message);
                if (ex.InnerException != null)
                {
                    Console.WriteLine("Inner Exception is: " + ex.InnerException.Message);
                }
            }
            finally
            {
                // Close the IDOC client
                if (idocClient != null)
                {
                    if (idocClient.State == CommunicationState.Opened)
                        idocClient.Close();
                    else
                        idocClient.Abort();
                }
            }

        }
    }
}
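
A rough sketch of how the same calls fit inside the SSIS script destination is below. The names produced by the wizards (Input0Buffer, the MatmasIdocClient client, the MATMAS01 IDOC type) are assumptions here, so substitute the names your own Add Adapter Service Reference run generated.

// Inside the Script Component configured as a Destination (ScriptMain).
// Assumes the default input name "Input 0" and hypothetical generated types
// MatmasIdocClient / MATMAS01; adjust to the types in your IDOC library.
using System;
using System.ServiceModel;
using Microsoft.Adapters.SAP;

public class ScriptMain : UserComponent
{
    private MatmasIdocClient idocClient;

    public override void PreExecute()
    {
        base.PreExecute();
        SAPBinding binding = new SAPBinding { AutoConfirmSentIdocs = true };
        EndpointAddress address = new EndpointAddress("{SAP Connection String, see the settings above}");
        idocClient = new MatmasIdocClient(binding, address);
        idocClient.ClientCredentials.UserName.UserName = "YourUserName";
        idocClient.ClientCredentials.UserName.Password = "YourPassword";
        idocClient.Open();
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Map the incoming row to the IDOC object (columns/segments are package specific).
        MATMAS01 idoc = new MATMAS01();
        // idoc.<segment>.<field> = Row.<column>; populate the header and segments here.

        // Same pattern as the console sample: strongly typed Send plus a nullable GUID.
        Guid? adapterTxGuid = Guid.NewGuid();
        idocClient.Send(idoc, ref adapterTxGuid);
    }

    public override void PostExecute()
    {
        if (idocClient != null && idocClient.State == CommunicationState.Opened)
            idocClient.Close();
        base.PostExecute();
    }
}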

Pure C# ETL (Extract Transform Load)

When it comes to ETL (Extract, Transform, Load), the default tool for the job is Microsoft SQL Server Integration Services (SSIS), hands down. SSIS stores information about the source and target in its metadata: every column size, type, and length. The problem arises when a data type or size changes and the metadata gets out of sync. To fix it, you need to open the package and refresh either the source or the destination shape so it can pick up the updated metadata.

We had a project that needed an ETL solution with the following requirements: process a complex XML file (a multi-level, self-referencing structure), support multiple versions of the data, be configurable at runtime, and not be too sensitive to data types. I ended up building a flexible ETL solution in C# with the help of the Unity framework, AutoMapper, LINQ to SQL, and a stored procedure for bulk insert.

Library information:

1. Unity – dependency injection framework.

2. AutoMapper – object-to-object mapping.

The processing is broken down into six parts (a driver sketch follows the list):

1. When the XML comes in, it is deserialized into an object.

2. Using reflection, loop through every property of the object.

3. Get the type of the object and create a data processor for it using Unity if no instance exists; if an instance already exists, reuse it.

4. Call the data processor to convert the deserialized format into the data entities using AutoMapper and add the converted instance to its internal collection (a collection of objects).

5. After looping, instruct every data processor to commit the data.

6. Get the processing and error information.
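
A minimal sketch of a driver that ties the six steps together, assuming it lives in the same class as the DeserializeFile and Parse members shown below; ComplexRoot stands in for the xsd.exe-generated root class and CommitAll is a hypothetical manager method (a sketch of it appears near the end of this post).

public void ProcessFile(string path)
{
    // 1. Deserialize the XML into an object graph.
    ComplexRoot root = DeserializeFile<ComplexRoot>(path);

    // 2-4. Walk the graph with reflection; each object is converted with AutoMapper
    // and queued in the data processor that Unity resolved for its type.
    Parse(root, null);

    // 5. Instruct every data processor to commit (persist) its collection.
    mDataProcessorManager.CommitAll();

    // 6. Gather processing and error information, e.g. the types that had no
    // processor (see mListOfUnsupportedTypes in the Unity snippet below).
}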

 Implementation:

Step 1. Deserialization

1.1 First you need an XML schema (XSD) for the file; then use the xsd.exe tool to generate the class used in the deserialization step (e.g. xsd.exe YourSchema.xsd /classes).

1.2 To deserialize the XML into that object, use the following code:

// Requires: using System.IO; using System.Xml.Serialization;

/// <summary>
/// Deserialize Xml
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="path"></param>
/// <returns></returns>
protected T DeserializeFile<T>(string path)
{
    T item;
    using (StreamReader sr = new StreamReader(path))
    {
        XmlSerializer xs = new XmlSerializer(typeof(T));
        item = (T)xs.Deserialize(sr);
    }
    return item;
}

Steps 2, 3 & 4. Loop through every property using reflection; for every object, create a data processor using Unity and instruct it to add the object to its internal collection. Each data processor holds a collection matching its type.

To loop through every property of the object using reflection, use the following code:


// Requires: using System; using System.Reflection;
// GetPropertyValue is a helper/extension method (not shown) that reads a property value by name.

private DataProcessorManager mDataProcessorManager;

/// <summary>
/// Walks the object graph recursively and hands every object to its data processor.
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="item"></param>
/// <param name="parentID"></param>
protected virtual void Parse<T>(T item, string parentID)
{
    if (item != null)
    {
        Type obj = item.GetType();
        // Check if the object is an array
        if (item.GetType().IsArray)
        {
            Array array = item as Array;
            // Loop through every object and process it one by one
            foreach (var o in array)
            {
                // Process the object and retrieve the reference GUID
                ProcessSegment(o, parentID);

                string refNo = o.GetPropertyValue("GUID");
                // Check whether the object contains other properties that are objects or arrays
                // of objects, aside from primitive types or types to be skipped.
                foreach (PropertyInfo p in o.GetType().GetProperties())
                {
                    if (IsPropertyToBeProcessed(p))
                    {
                        // Get the property value
                        var prop = p.GetValue(o, null);
                        // Recursive call
                        if (prop != null)
                        {
                            Parse(prop, refNo);
                        }
                    }
                }
            }
        }
        else
        {
            // Process the object and retrieve the GUID
            ProcessSegment(item, parentID);
            string refNo = item.GetPropertyValue("GUID");
            // Check whether the object contains other properties that are objects or arrays
            // of objects, aside from primitive types or types to be skipped.
            foreach (PropertyInfo p in obj.GetProperties())
            {
                if (IsPropertyToBeProcessed(p))
                {
                    // Get the property value
                    var prop = p.GetValue(item, null);
                    // Recursive call
                    if (prop != null)
                    {
                        Parse(prop, refNo);
                    }
                }
            }
        }
    }
}

/// <summary>
/// </summary>
/// <param name="item"></param>
/// <param name="refNo"></param>
protected virtual void ProcessSegment(object item, string refNo)
{
    //Get the type of object to be processed
    Type itemType = item.GetType();

    //Top Level (Skip)
    if (!mToBeSkippedTypeList.Exists(i => i.Equals(itemType)))
    {
        //Check whether there's a data processor for that object
        mDataProcessorManager.ConvertAndAdd(item, refNo);
    }
}

/// <summary>
/// Check whether this object is one we need to process; exclude
/// primitive types like Int32, String, etc.
/// </summary>
/// <param name="propertyInfo"></param>
/// <returns></returns>
protected bool IsPropertyToBeProcessed(PropertyInfo propertyInfo)
{
    bool toBeProcessed = true;
    if (propertyInfo == null)
    {
        toBeProcessed = false;
        return toBeProcessed;
    }
    // Check whether the type is not primitive (int, byte, etc.) and not on the list
    // of types to be skipped.
    if (mToBeSkippedTypeList.Exists(i => i.Equals(propertyInfo.PropertyType)))
    {
        toBeProcessed = false;
    }
    return toBeProcessed;
}

To create an instance of a data processor using Unity:


// Resolve the processor using Unity
try
{
    // Try to create a data processor using Unity; every data processor inherits from DataProcessorFactory
    DataProcessorFactory f = mUnityContainer.Resolve<DataProcessorFactory>(itemType.Name);
    // SetMapper configures AutoMapper explicitly for property names that don't match
    f.SetMapper();
    mProcessorCollection.Add(itemType, f);
}
catch (ResolutionFailedException unityExc)
{
    // Add the itemType to the list of unsupported objects
    if (!mListOfUnsupportedTypes.Contains(itemType))
    {
        mListOfUnsupportedTypes.Add(itemType);
    }
    isSuccess = false;
}

DataProcessorFactory defines the abstract methods that each data processor must implement; in other words, one data processor handles one object type. The signatures are below, followed by a sketch of a concrete processor.


/// <summary>
/// This method is intended for converting a source object to destination.
/// This should be implemented by derived class since converting may differ per object.
/// In this method, AutoMapper is called for conversion and the converted object
/// is added to the list/collection that will be processed when Execute method is called.
/// </summary>
/// <typeparam name="TSource">Type of object to be converted and added</typeparam>
/// <param name="item"></param>
/// <param name="refNo">Reference Number of parent object in case of hierarchical processing</param>
/// <returns>Returns true if successful</returns>
public abstract bool ConvertAndAdd<TSource>(TSource item, string refNo);

/// <summary>
/// Perform processing on objects that is/are on the collection
/// This is where data persisting occurs.
/// </summary>
/// <returns></returns>
public abstract bool Execute();

/// <summary>
/// This is where the AutoMapper configuration is set.
        /// Since AutoMapper by default only maps properties with matching names, and the identity of
        /// the object is always in the GUID element, the destination's ID is mapped to the source object's GUID.
/// </summary>
public virtual void SetMapper()
{
}
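
Below is a sketch of what a concrete processor might look like. CustomerSegment, CustomerEntity, ParentID, and the persistence call are hypothetical names, and the static Mapper.CreateMap/Mapper.Map calls assume the older AutoMapper API that was current in the .NET 3.5 era.

// Hypothetical concrete processor: converts CustomerSegment (deserialized XML type)
// into CustomerEntity (persistence type) and bulk-inserts the batch when Execute is called.
using System.Collections.Generic;
using AutoMapper;

public class CustomerDataProcessor : DataProcessorFactory
{
    private readonly List<CustomerEntity> mItems = new List<CustomerEntity>();

    public override void SetMapper()
    {
        // GUID on the source becomes ID on the destination; other properties map by name.
        Mapper.CreateMap<CustomerSegment, CustomerEntity>()
              .ForMember(d => d.ID, o => o.MapFrom(s => s.GUID));
    }

    public override bool ConvertAndAdd<TSource>(TSource item, string refNo)
    {
        CustomerEntity entity = Mapper.Map<CustomerEntity>(item);
        entity.ParentID = refNo; // keep the reference to the parent object
        mItems.Add(entity);
        return true;
    }

    public override bool Execute()
    {
        // Persist the whole batch, e.g. via LINQ to SQL or a bulk-insert stored procedure.
        // new SampleDataContext().BulkInsertCustomers(mItems); // hypothetical call
        return mItems.Count > 0;
    }
}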

To commit the data, you just need to call the Execute method of every data processor instance, as in the sketch below.
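
For instance, a CommitAll helper on the DataProcessorManager could look like this; the method name is hypothetical and it assumes mProcessorCollection is the Dictionary<Type, DataProcessorFactory> populated in the Unity snippet above.

public bool CommitAll()
{
    bool allSucceeded = true;
    foreach (DataProcessorFactory processor in mProcessorCollection.Values)
    {
        // Execute persists the items the processor collected (see the abstract methods above).
        allSucceeded &= processor.Execute();
    }
    return allSucceeded;
}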

SSIS – Centralized Connection Repository

Maintaining a central repository of connection information for SSIS packages is important, especially if you have multiple packages that connect to one or more different systems. Keeping track of the connections and changing them will be hard if not done properly.

This blog entry describes how to use the SQL Server configuration type to store connection info, in combination with the traditional file configuration type (dtsConfig) for package-specific configurations.

1. Setting up the configuration store

A table with the definition below needs to be created in a new or existing database; it will store the connection information.

The SSIS Configurations table has the following columns:

ConfigurationFilter (nvarchar(255)) – used by SSIS to identify a set of property/value pairs that belong to the same configuration entry in the Package Configurations Organizer.

ConfiguredValue (nvarchar(255)) – stores the value used to update the package property specified in the PackagePath column.

PackagePath (nvarchar(255)) – the path that points to the property being configured.

ConfiguredValueType (nvarchar(255)) – the SSIS data type of the property being configured.

2. Environment Variable

In this step, we create an environment variable that stores the connection string to the configuration store (see step 1).

1. On the Start menu, click Control Panel.

2. In Control Panel, double-click System.

3. In the System Properties dialog box, click the Advanced tab, and then click Environment Variables.

Figure 1 – System Properties

4. In the Environment Variables dialog box, in System variables frame, click New.

In the New System Variables dialog box, type SSISConfiguration in the Variable name box, and Data Source={DatabaseServer};Initial Catalog={DatabaseName};Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False; in the Variable value box.

5. Click OK.

6. Click OK to exit the System Properties dialog box.

3. SSIS Package Settings

It is assumed that the package is already in the final phase of development and the only thing that needs to be done is to update it to retrieve settings from the configuration store at runtime.

 3.1 Add Configuration Store connection

Since we are going to retrieve connection settings from the configuration store, which is a table, we need to define a new OLE DB connection for it.

1. In the Connection Managers, right click New OLE DB Connection.

2. Click New, set valid credentials to connect, then under Connect to a database server (this is the connection to the configuration store), select <{Database}>.

3. In the Connection Managers rename the added connection to OLEDB_SSISConfiguration.

 3.2 Configure Configuration Store Connection

In the following steps we update the connection to the configuration store (added in 3.1) to use the environment variable we defined in step 2.

1. Click SSIS Menu, Package Configurations. Package Configurations Organizer opens.

2. Click Enable package configurations.

3. Click Add, Click Next.

4. In the Configuration Type, Select Environment variable.

In the Environment variable drop-down, Select SSISConfiguration.

Click Next.

Note:
If SSISConfiguration is not in the list, try reopening BIDS (Business Intelligence Development Studio).

5. On the Select Target Property page, under Connection Managers,

Click OLEDB_SSISConfiguration,

Click Properties, Select ConnectionString.

Click Next.

6. Type OLEDB_SSISConfiguration in the Configuration name and click Finish.

7. It should look like this:

3.3 Configure existing connection to use configuration store.

After the connection to the configuration store is properly set up, the next step is to update the existing connections in the package to retrieve their settings from it.

Important:
Repeat the steps for each connection in the package; if the package has connections to two databases, the steps below need to be repeated twice. The example below illustrates an SSIS package that has a connection to one database: SampleDB.

1. Click SSIS Menu, Package Configurations.

Package Configurations Organizer opens.

2. Click Add, Click Next.

3. In the configuration Type, Select SQL Server.

In the Connection, Select OLEDB_SSISConfiguration

In the Configuration Table, Select [dbo].[SSISConfigurations]

In the Configuration Filter, Set it to the <name of the connection>,

Ex. OLEDB_SampleDB.

Click Next.

4. In the Connection Managers, select the connection (the connection named in step 3), expand Properties, and select ConnectionString.

Click Next.

Important:
Select only one ConnectionString in this step since the entire step is repeated per connection. This ensures that the settings defined in the configuration store can be reused in other packages.

5. In the Configuration Name, type the name of the connection (from step 3). Click Finish.

6. Repeat steps for other connections (if applicable).

7. Update the SSIS Configurations table if a SQL login is used, since SSIS doesn’t save sensitive information such as passwords.

Note:
Set the connection manager to DelayValidation=TRUE if validation fails during execution. For some reason, validation uses the value stored in the package and not the values from the SQL configuration store (database).

If you encounter this error: “Cannot insert configuration information into the configuration table. (Microsoft.DataTransformationServices.Wizards) Could not complete wizard actions.”

It means that one of the properties exceeds the length limit of the SSIS configuration table; usually the connection string is the culprit. Try shortening the connection string by truncating the Application Name property or giving it a short name.

SSIS How to copy files from one server to another

A common question is how to copy/move/delete files across shared folders using SSIS.
The quick answer is to use the File System Task for the actual file operations, but what about permissions?
By default it is assumed that the account used to execute the package (either the SQL Server Agent identity when using a SQL job, or the account that runs DTExec) has the appropriate permissions on the source/destination server.

To get the identity of SQL Server Agent:

1. At the command prompt, type services.msc; this opens the list of all services on the server.

2. In the list of services, look for SQL Server Agent, right-click -> Properties, and go to the Log On tab.

Using a proxy, if the package should run under an identity other than the Agent:

1. Under Security -> Credentials, create a new credential (username and password).

2. Then go to SQL Server Agent -> Proxies -> SSIS Package Execution and create a new proxy that uses the credential. Finally, on the SQL job step itself, set Run as to the proxy.

Using the NET USE command, if you need to connect to multiple resources (shared folders) or don't use SQL Server Agent:

Issue the NET USE command before the File System Task. Example syntax:

net use \\RPAULO\SharedFolder p@assword /user:RPAULO\SSISUser

For the NET USE command reference, click here.
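
If you prefer to issue the command from a Script Task instead of an Execute Process Task, a minimal C# sketch is below; the share, user name, and password are placeholders and would normally come from package variables or a configuration.

// Runs NET USE from an SSIS Script Task before the File System Task executes.
using System.Diagnostics;

public void ConnectToShare()
{
    ProcessStartInfo psi = new ProcessStartInfo("net",
        @"use \\RPAULO\SharedFolder p@assword /user:RPAULO\SSISUser")
    {
        CreateNoWindow = true,
        UseShellExecute = false
    };

    using (Process p = Process.Start(psi))
    {
        p.WaitForExit();
        if (p.ExitCode != 0)
        {
            throw new System.Exception("NET USE failed with exit code " + p.ExitCode);
        }
    }
}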


SSRS – Calling Reporting Services using WCF and using the HTML output for email notification

Recently we had a project that required us to use Reporting Services for email notifications. In short, the HTML output of the report is assigned to the body of the email notification (a sketch of that final step appears at the end of this post).

1) The first step is to create a proxy library containing a reference to the Reporting Services web service. To do this, click Add Service Reference and set the URL to:

http://{ReportServerName}/ReportServer/ReportExecution2005.asmx, and set the collection type to System.Collections.Generic.List.

Note: Version of SSRS is 2005

Update!: Problem with Authorization:

“The HTTP request is unauthorized with client authentication scheme ‘Negotiate’. The authentication header received from the server was ‘NTLM’.”

To fix the authorization issue in the LoadReport method:

1. A new method called GetDefaultBinding was added.

2. A new parameter named useDefaultBinding (defaulting to false) was also added to the Export method. If it is set to true, the binding from GetDefaultBinding is used together with DefaultNetworkCredentials.

Note: the network credential parameter is not used when useDefaultBinding is set to true.

Updated code is as follows:

  /// <summary>
    /// Utility class that renders and exports a SQL Reporting Services report into the specified output format.
    /// </summary>
    public static class ReportExporter
    {
        private static string GetExportFormat(ExportFormat f)
        {
            switch (f)
            {
                case ExportFormat.XML: return "XML";
                case ExportFormat.CSV: return "CSV";
                case ExportFormat.Image: return "IMAGE";
                case ExportFormat.PDF: return "PDF";
                case ExportFormat.MHTML: return "MHTML";
                case ExportFormat.HTML4: return "HTML4.0";
                case ExportFormat.HTML32: return "HTML3.2";
                case ExportFormat.Excel: return "EXCEL";
                case ExportFormat.Word: return "WORD";

                default:
                    return "PDF";
            }
        }

        /// <summary>
        /// Exports a Reporting Service Report to the specified format using Windows Communication Foundation (WCF) endpoint configuration specified.
        /// </summary>
        /// <param name="url">Address of Report Web Service</param>
        /// <param name="clientCredentials">Network Credential to use to connect to the web service</param>
        /// <param name="reportName">Friendly name of Reporting Services Report to execute</param>
        /// <param name="report">Reporting Services Report to execute</param>
        /// <param name="parameters">report parameters</param>
        /// <param name="format"></param>
        /// <param name="output">rendering output result in bytes</param>
        /// <param name="extension">output format file extension</param>
        /// <param name="mimeType">output MIME type</param>
        /// <param name="encoding">output encoding</param>
        /// <param name="warnings">warnings (if any)</param>
        /// <param name="streamIds">stream identifiers for external resources (images, etc) that are associated with a given report</param>
        /// <param name="dataSourceCredentials"></param>
        public static void Export(string url, System.Net.NetworkCredential clientCredentials, string reportName, string report, ParameterValue[] parameters,
            ExportFormat format, out byte[] output, out string extension, out string mimeType, out string encoding, out Warning[] warnings, out string[] streamIds, DataSourceCredentials[] dataSourceCredentials=null, bool UseDefaultBinding=false)
        {

            Binding binding;
            if (UseDefaultBinding)
            {
                binding = GetDefaultBinding();
            }
            else
            {
                binding = GetBinding();
            }

            using (var webServiceProxy = new ReportExecutionServiceSoapClient(binding, new EndpointAddress(url)))
            {
                webServiceProxy.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;

                if (UseDefaultBinding)
                {
                    webServiceProxy.ClientCredentials.Windows.ClientCredential = System.Net.CredentialCache.DefaultNetworkCredentials;
                }
                else
                {
                    webServiceProxy.ClientCredentials.Windows.ClientCredential = clientCredentials;
                }

                // Init Report to execute
                ServerInfoHeader serverInfoHeader;
                ExecutionInfo executionInfo;

                try
                {

                    ExecutionHeader executionHeader = webServiceProxy.LoadReport(null, report, null, out serverInfoHeader, out executionInfo);

                    // Attach Report Parameters
                    webServiceProxy.SetExecutionParameters(executionHeader, null, parameters, null, out executionInfo);

                    //Set Credentials
                    if(dataSourceCredentials!=null)
                        webServiceProxy.SetExecutionCredentials(executionHeader, null, dataSourceCredentials, out executionInfo);

                    // Render
                    webServiceProxy.Render(executionHeader, null, GetExportFormat(format), null, out output, out extension, out mimeType, out encoding, out warnings, out streamIds);
                }
                catch (FaultException faultExc)
                {
                    throw new Exception(string.Format("Encountered error while generating the report: '{0}', Report Url: '{1}', Web Service Url: '{2}' Message: '{3}'",  reportName, report, url, faultExc.Message ), faultExc);
                }
            }
        }

        /// <summary>
        /// Export a Reporting Services to HTML
        /// </summary>
        /// <param name="url">Address of Report Web Service</param>
        /// <param name="clientCredentials">Network Credential to use to connect to the web service</param>
        /// <param name="reportName">Friendly name of Reporting Services Report to execute</param>
        /// <param name="report">Reporting Services Report to execute</param>
        /// <param name="parameters">report parameters</param>
        /// <param name="dataSourceCredentials"></param>
        /// <returns></returns>
        public static string ExportToHTML(string url, System.Net.NetworkCredential clientCredentials, string reportName, string report, ParameterValue[] parameters, DataSourceCredentials[] dataSourceCredentials = null, bool UseDefaultBinding = false)
        {
            Warning[] warn;
            string[] p;

            byte[] output;
            string extension, mimeType, encoding;

            //Get HTML
            ReportExporter.Export(url,
                                  clientCredentials,
                                  reportName,
                                  report,
                                  parameters.ToArray(),
                                  ExportFormat.HTML4,
                                  out output,
                                    out extension,
                                    out mimeType,
                                    out encoding, out warn, out p, dataSourceCredentials, UseDefaultBinding
                                    );

            System.Text.Encoding enc = System.Text.Encoding.GetEncoding("UTF-8");
            string html = enc.GetString(output);
            return html;
        }

        /// <summary>
        /// Returns the binding to use, eliminates the app.config
        /// </summary>
        /// <returns></returns>
        internal static Binding GetBinding()
        {
            BasicHttpBinding binding = new BasicHttpBinding();
            binding.MaxBufferPoolSize = 0;
            binding.MaxReceivedMessageSize = 5242880;
            binding.HostNameComparisonMode = HostNameComparisonMode.StrongWildcard;
            binding.TextEncoding = System.Text.Encoding.UTF8;
            binding.MessageEncoding = WSMessageEncoding.Text;
            binding.TransferMode = TransferMode.Buffered;
            binding.UseDefaultWebProxy = true;
            binding.ReaderQuotas = new System.Xml.XmlDictionaryReaderQuotas
              {
                  MaxArrayLength = 2147483647,
                  MaxBytesPerRead = 2147483647,
                  MaxDepth = 2147483647,
                  MaxNameTableCharCount = 2147483647,
                  MaxStringContentLength = 2147483647
              };
            binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
            binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Windows;
            return binding;
        }

        /// <summary>
        /// Returns the binding to use, eliminates the app.config
        /// </summary>
        /// <returns></returns>
        internal static Binding GetDefaultBinding()
        {
            BasicHttpBinding binding = new BasicHttpBinding();
            binding.MaxBufferPoolSize = 0;
            binding.MaxReceivedMessageSize = 5242880;
            binding.HostNameComparisonMode = HostNameComparisonMode.StrongWildcard;
            binding.TextEncoding = System.Text.Encoding.UTF8;
            binding.MessageEncoding = WSMessageEncoding.Text;
            binding.TransferMode = TransferMode.Buffered;
            binding.UseDefaultWebProxy = true;
            binding.ReaderQuotas = new System.Xml.XmlDictionaryReaderQuotas
            {
                MaxArrayLength = 2147483647,
                MaxBytesPerRead = 2147483647,
                MaxDepth = 2147483647,
                MaxNameTableCharCount = 2147483647,
                MaxStringContentLength = 2147483647
            };
            binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
            binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Ntlm;
            return binding;
        }
    }

Updated Unit Test to use default bindings:

[TestMethod]
        public void GetHTMLReportTest()
        {

            //Set network credentials
            string userName = "NotUsed";
            string password = "NotUsed";
            string domain = "NotUsed";
            System.Net.NetworkCredential credentials = new System.Net.NetworkCredential(userName, password, domain);
            string outputPath = @"C:\TestFolder";

            //Url
            string path = "/MyReport/Invoice";

            //Parameters
            List<ParameterValue> parameters = new List<ParameterValue>();
            parameters.Add(new ParameterValue { Name = "AcctId", Value = "0001" });

            //Get HTML
            string html = ReportExporter.ExportToHTML("http://HypV-Rpaulo/ReportServer/ReportExecution2005.asmx",
                                  credentials,
                                  "My invoice report",
                                  path ,
                                  parameters.ToArray(),UseDefaultBinding:true
                                  );

            Assert.AreEqual(true, html.Length > 0, "No HTML generated");
            Console.WriteLine(html);
            System.IO.File.WriteAllText(System.IO.Path.Combine(outputPath, "result.html"), html);

        }

The library can be used in BizTalk (deployed to the GAC) or in SSIS 2005/2008 (deployed to the GAC, targeting .NET 3.5 or earlier, and copied to the %Windir%\Microsoft.NET\Framework\v2.0.50727 folder so it can be referenced in an SSIS package).
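
To finish the scenario described at the start of this post, here is a minimal sketch of assigning the HTML output to an email body; the SMTP host, addresses, report path, and empty parameter array are placeholders, and ParameterValue comes from the generated service reference.

// Sends the rendered report as the body of an email notification.
using System.Net;
using System.Net.Mail;

public static void SendReportNotification()
{
    string html = ReportExporter.ExportToHTML(
        "http://{ReportServerName}/ReportServer/ReportExecution2005.asmx",
        CredentialCache.DefaultNetworkCredentials,
        "My invoice report",
        "/MyReport/Invoice",
        new ParameterValue[0],
        UseDefaultBinding: true);

    using (MailMessage message = new MailMessage("reports@example.com", "recipient@example.com"))
    {
        message.Subject = "Invoice report";
        message.Body = html;
        message.IsBodyHtml = true; // render the report HTML in the email client

        new SmtpClient("smtp.example.com").Send(message);
    }
}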

Entity Framework – The version of SQL Server in use does not support datatype ‘datetime2’

I came across this error after I generated the EDMX model (Entity Framework) against a SQL Server 2008 R2 database and then ran it in an environment with a different SQL Server version (2005).

To fix this issue:

Manually open the EDMX file using an XML editor or Notepad and replace the value of ProviderManifestToken from 2008 to 2005.

SQL Server 2008 R2 Logical name truncation during backup

We’re currently upgrading an existing solution from SQL Server 2005 to SQL Server 2008 R2. The solution auto-generates a database and performs a couple of operations, after which it creates a backup.

It was during the restore process that we encountered the error:

Logical file ‘<database_name>’ is not part of database ‘<database_name>’. Use RESTORE FILELISTONLY to list the logical file names. RESTORE DATABASE is terminating abnormally.

So we took a peek at the generated backup file and discovered that the last letter of the logical file name was being truncated. It happens when the logical file name of the database is renamed before creating a backup.

You can reproduce it using this script (it happens on SQL Server 2008 R2 without any updates):

USE [master]
GO

CREATE DATABASE [Test]
GO
ALTER DATABASE [Test] SET RECOVERY SIMPLE
GO

-- Rename Logical Files and Start Backup
ALTER DATABASE [Test]
MODIFY FILE
(NAME = [Test], NEWNAME = 'Test_New')
GO

ALTER DATABASE [Test]
MODIFY FILE
(NAME = Test_log, NEWNAME = 'Test_New_Log')
GO

-- Backup Working Copy
BACKUP DATABASE [Test] TO DISK = N'E:\Test.bak' WITH NOFORMAT, NOINIT, NAME = N'Test_New', SKIP, REWIND, NOUNLOAD, STATS = 10
GO

Message:

The file name ‘Test_New’ has been set.
The file name ‘Test_New_Log’ has been set.
13 percent processed.
22 percent processed.
31 percent processed.
40 percent processed.
53 percent processed.
62 percent processed.
71 percent processed.
80 percent processed.
94 percent processed.
Processed 176 pages for database ‘Test’, file ‘Test_Ne’ on file 2.
100 percent processed.
Processed 2 pages for database ‘Test’, file ‘Test_New_Lo’ on file 2.
BACKUP DATABASE successfully processed 178 pages in 0.027 seconds (51.486 MB/sec).

The question is how the QA guys missed this part; it was working prior to SQL Server 2008 R2.

A fix has been released to resolve this issue.

Or use a workaround:

To work around this issue, use one of the following two methods:

  • After the logical file name is modified, run the following statements to restart the database:
    ALTER DATABASE <database name> SET OFFLINE
    ALTER DATABASE <database name> SET ONLINE
  • When you modify the logical file name, append a dummy space character in the new logical name. For example, if you want to change the logical name to DB_modified, use the following statement to modify the logical file name:
    ALTER DATABASE <database name> MODIFY FILE
     (NAME=N'<database name>', NEWNAME='DB_modified ') -- Append ' ' to NEWNAME