Forgotten Password – How to recover it from Remote Desktop Connection Manager

After a long vacation celebrating Christmas and New Year, like most people I had to go back to work. I checked my emails and checked some servers using Remote Desktop Connection Manager. However, when I needed to log in to the Azure Portal, I had to re-enter my credentials… hmm. I looked at my OneNote (secure password 🙂 ) but the password didn't seem to work. I went back to Remote Desktop Connection Manager, but the password can't be shown as clear text. Time for Google: how to decrypt the password from Remote Desktop Connection Manager.

By using the PowerShell script below (thanks to this link), you'll be able to retrieve the passwords from Remote Desktop Connection Manager.

# Path to RDCMan.exe
$RDCMan = "C:\Program Files (x86)\Microsoft\Remote Desktop Connection Manager\RDCMan.exe"
# Path to the RDG file
$RDGFile = "{<>}"
$TempLocation = "C:\temp"

# Copy RDCMan.exe as a DLL so its RdcMan types can be loaded into PowerShell
Copy-Item $RDCMan "$TempLocation\RDCMan.dll"
Import-Module "$TempLocation\RDCMan.dll"
$EncryptionSettings = New-Object -TypeName RdcMan.EncryptionSettings

# Load the RDG file and pick out every logonCredentials node
$XML = New-Object -TypeName XML
$XML.Load($RDGFile)
$logonCredentials = Select-Xml -Xml $XML -XPath '//logonCredentials'

$Credentials = New-Object System.Collections.ArrayList
$logonCredentials | ForEach-Object {
    [void]$Credentials.Add([pscustomobject]@{
        Username = $_.Node.userName
        Password = $(Try { [RdcMan.Encryption]::DecryptString($_.Node.password, $EncryptionSettings) } Catch { $_.Exception.InnerException.Message })
        Domain   = $_.Node.domain
    })
}

$Credentials | Sort-Object Username

 

How to deploy Azure Logic Apps using Azure DevOps

If you're looking for answers to the following:

  • How to use Azure DevOps to deploy Azure Logic Apps?
  • How to do Continuous deployment of Azure Logic Apps using Azure DevOps?
  • How to develop Azure Logic Apps using Visual Studio and deploy them using Azure DevOps?

You’re in the right place.

In this demonstration, I'll be using the following tools/services:

  • Visual Studio 2017 with Logic Apps Tools 
  • Azure DevOps (as of writing, they're changing the product every 3 weeks)

Walkthrough

  • Create Azure Logic Apps using Visual Studio 2017
    • Create multiple parameters per environment
  • Azure DevOps setup
    • Setup of Azure DevOps service connections
    • Setup of Azure DevOps pipelines
    • Setup of Azure DevOps release

Create Azure Logic Apps using Visual Studio 2017

  1. Open Visual Studio and connect to your Azure DevOps instance.
  2. Once connected, click File -> New Project -> Cloud -> Azure Resource Group.

[Screenshot: LogicAppProject]

3. Select the Logic App template and click Next.

[Screenshot: LogicAppTemplate]

4. Open the logic app by right-clicking LogicApp.json -> Open With -> Logic App Designer.

[Screenshot: LogicAppOpenDesigner]

5. It will prompt for the Logic App properties; from here you'll have to select the subscription and resource group.

[Screenshot: LogicAppSubscription]

6. Develop the Logic App. You can start from one of the templates already available.

7. Create copies of the parameters file. In this example, I'll create three different parameters files (dev, test and production). In each parameters file, I'll use a different logic app name per environment. The parameters files will be used later on in the Azure DevOps release (see the sketch after the screenshot below).

[Screenshot: LogicAppsParams]
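For reference, here's a minimal sketch of what one of those files (say, dev.parameters.json) might look like. The parameter name logicAppName and its value are purely illustrative; use whatever parameters your LogicApp.json template actually declares.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "my-logicapp-dev"
    }
  }
}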

8. Check in the source code to Azure DevOps; later on we will link the Azure Pipeline to this source control.

Once this is done, we’re ready to configure and deploy using Azure DevOps.

Azure DevOps Setup

  1. Set up the service connections; these are needed when we configure the release.
  2. Go to the Azure DevOps project -> Configure -> Pipelines -> Service Connections -> New Service Connection, then select Add an Azure Resource Manager service connection. [Screenshot: AzureDevOpsServiceConnection]
  3. Add a connection name. For the first one you can name it Azure Development. Create two more connections for Test and Production. In all cases, make sure to select a different resource group.
  4. Set up the Azure DevOps pipeline; nothing special, just build the solution and publish the build artifact (see the YAML sketch after this list). [Screenshot: AzurePipelineSetup]
  5. Set up the Azure DevOps release; overall it looks like this, and the configuration repeats for every stage (see #7). [Screenshot: AzureReleaseView]
  6. Development stage setup. You only need the Create Or Update Resource Group action. Select the correct subscription (the service connection we created in #1) and the template parameters file (dev.parameters.json for Development). [Screenshot: AzureReleaseSetup]
  7. Clone the Development Stage to Test and Production.
  8. The only things that need to be changed are the subscription and the parameters.json file.
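If you prefer YAML over the classic designer for step 4, a minimal build pipeline could look roughly like the sketch below. The task names are the standard Azure DevOps build tasks; the solution path, file pattern, pool image and artifact name are placeholders you'd adapt to your project.

# Hypothetical azure-pipelines.yml: build the Azure Resource Group project and publish the templates as an artifact
trigger:
- master

pool:
  vmImage: 'vs2017-win2016'

steps:
- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '**/*.sln'

- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    configuration: 'Release'

# Copy the Logic App template and parameters files into the artifact staging folder
- task: CopyFiles@2
  inputs:
    Contents: '**/LogicApp*.json'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'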

 

Testing

  1. Provided that you've set up the pipeline to do continuous integration (Triggers -> Enable continuous integration), the moment you make changes to the logic app and check in the code, the pipeline will be triggered.
  2. Once the pipeline has succeeded, you can create a release from it. This is how it looks: [Screenshot: AzureReleaseSuccessful]

 

References:

http://www.integrationusergroup.com/continuous-integration-logic-apps-using-team-foundation-team-services/

BizTalk Map – Eliminate duplicate records using functoid

Recently there was a requirement to eliminate duplicate records in the output using a BizTalk map.

A brilliant solution can be found here:

I've added step-by-step instructions on how to do it.

Overall, the solution looks like this:

[Screenshot: Duplicate]

You'll need two Scripting functoids and an Equal logical functoid.

  1. The top functoid will contain the declaration of a variable with the following code:

// Keeps track of the IDs we've already seen during the transform
System.Collections.Specialized.StringCollection uniqueIds = new System.Collections.Specialized.StringCollection();

2. The other functoid will take in the ID of the source message on which uniqueness will be checked, in this case CustomerId. The code is as follows:

public bool IsUniqueId(string id)
{
    // Returns true only the first time a given id is seen; false for subsequent duplicates
    if (uniqueIds.Contains(id))
        return false;
    uniqueIds.Add(id);
    return true;
}

3. The Equal logical functoid will take the output of the second Scripting functoid, and its output should be mapped to the target record.

 

This is my input message:

<ns0:Order xmlns:ns0="http://BizTalk_Server_Project1.Order">
  <Customer>
    <CustomerId>1</CustomerId>
    <Name>Customer 1</Name>
  </Customer>
  <Customer>
    <CustomerId>1</CustomerId>
    <Name>Duplicate Customer</Name>
  </Customer>
  <Customer>
    <CustomerId>2</CustomerId>
    <Name>Customer 2</Name>
  </Customer>
</ns0:Order>

And this is the output:

<ns0:Order xmlns:ns0="http://BizTalk_Server_Project1.Order">
  <Customer>
    <CustomerId>1</CustomerId>
    <Name>Customer 1</Name>
  </Customer>
  <Customer>
    <CustomerId>2</CustomerId>
    <Name>Customer 2</Name>
  </Customer>
</ns0:Order>

 

Voila! Without using XSLT you can achieve the same result.

Azure API Management – How to set basic authentication in SendRequest

Scenario: I have both the username and password stored in the Named values configuration of API Management, but I need to use them in a send-request policy.

Source Code:

<set-variable name="userName" value="{{username}}" />
<set-variable name="password" value="{{password}}" />
<set-variable name="basicAuthDetails" value="@{
    var username = context.Variables.GetValueOrDefault<string>("userName");
    var password = context.Variables.GetValueOrDefault<string>("password");
    // Basic authentication header value: "Basic " + Base64(username:password)
    return "Basic " + System.Convert.ToBase64String(System.Text.Encoding.GetEncoding("ISO-8859-1").GetBytes(username + ":" + password));
}" />
<set-variable name="jsonPayload" value="@{
    JObject transBody = new JObject();
    // Add all JSON properties
    transBody.Add("test", JToken.FromObject(new[]
    {
        "test data"
    }));
    return transBody.ToString();
}" />
<send-request mode="new" response-variable-name="var" ignore-error="false">
    <set-url>@{
        var url = context.Api.ServiceUrl + "/{someURL}";
        return url;
    }</set-url>
    <set-method>POST</set-method>
    <set-header name="Authorization" exists-action="override">
        <value>@(context.Variables.GetValueOrDefault<string>("basicAuthDetails"))</value>
    </set-header>
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body template="none">@(context.Variables.GetValueOrDefault<string>("jsonPayload"))</set-body>
</send-request>

 

That is how you can set basic authentication in SendRequest in Azure API Management.

Azure API Management Policy – Asynchronous API as Synchronous API

Below is the link to reference examples of using Azure API Management policies:

There's an example there on how to mask asynchronous calls as a synchronous API; however, for my requirement it's not enough.

The target API behaves as follows:

  1. All operations are POST and asynchronous.
  2. To get the result of any operation, a second API needs to be called, and it expects the transaction id.
  3. The API expects a command-style payload (an args parameter) instead of proper JSON name/value pairs.

Requirement: the API should return the results in a synchronous manner.

Solution:

A callout (send-request) policy in API Management.

Steps: 

  1. Create a GET operation and, via policy, rewrite the method to POST.
  2. Create a JSON transformation in the inbound policy and execute the backend API.
  3. Create a JSON transformation in the outbound policy and add a retry policy that executes the second API, which returns the result, passing the transaction id from #2.
  4. Return the results from the second API.

Policy Code:





<policies>
    <inbound>
        <base />
        <!-- Steps 1 and 2: force JSON, build the command-style payload, and rewrite the incoming GET into the POST the backend expects -->
        <set-header name="Content-Type" exists-action="override">
            <value>application/json</value>
        </set-header>
        <set-body>@{
            var paramFromRequest = Uri.UnescapeDataString(context.Request.OriginalUrl.Query.GetValueOrDefault("paramFromRequest"));
            JObject transBody = new JObject();
            transBody.Add("source",
                new JObject
                {
                    {"someproperty", "fixvalue"},
                    {"someproperty2", "fixvalue2"},
                });

            //Add all json properties as arg
            transBody.Add("args", JToken.FromObject(new[]
            {
                paramFromRequest
            }));
            return transBody.ToString();
        }</set-body>
        <!-- Keep a copy of the rewritten body so it can be reused when calling the result API below -->
        <set-variable name="jsonPayload" value="@(context.Request.Body.As<string>(true))" />
        <set-method>POST</set-method>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- Step 3: read the transaction id from the asynchronous response, set the "authorization" variable, then poll the second API until the result is available -->
        ...
        <retry condition="..." count="..." interval="...">
            <send-request mode="new" response-variable-name="secondResponse" ignore-error="false">
                <set-url>@{
                    var url = context.Api.ServiceUrl + "{{SECONDBACKENDAPI_URL}}";
                    return url;
                }</set-url>
                <set-method>POST</set-method>
                <set-header name="Authorization" exists-action="override">
                    <value>@(context.Variables.GetValueOrDefault<string>("authorization"))</value>
                </set-header>
                <set-header name="Content-Type" exists-action="override">
                    <value>application/json</value>
                </set-header>
                <set-body>@(context.Variables.GetValueOrDefault<string>("jsonPayload"))</set-body>
            </send-request>
            ...
        </retry>
        <!-- Step 4: return the result from the second API as the response body -->
        <set-variable name="results" value="@(((IResponse)context.Variables["secondResponse"]).Body.As<JObject>())" />
        <set-body>@((context.Variables.GetValueOrDefault<JObject>("results"))["{{PROPERTYNAME IN JSON OBJECT THAT CONTAINS THE RESULT}}"].ToString())</set-body>
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>





 

Querying Azure SQL Database using Azure Functions 2.0 to return JSON data

The guide below shows how you can easily query Azure SQL Database using Azure Functions.

I have to admit, I had to do multiple Google searches and combine the results to get a working solution.

Challenges:

  1. How to get the SQL connection string from the Azure Function settings: https://docs.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
  2. How to convert the SQL results to JSON: https://stackoverflow.com/questions/5083709/convert-from-sqldatareader-to-json

Steps:

  1. Create a new function with an HTTP trigger.
  2. Add a new file called serialize.csx with the contents below. This converts the SQL rows into JSON-friendly dictionaries.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Converts every row returned by the reader into a dictionary keyed by (camelCased) column name
public static IEnumerable<Dictionary<string, object>> Serialize(SqlDataReader reader)
{
    var results = new List<Dictionary<string, object>>();
    var cols = new List<string>();
    for (var i = 0; i < reader.FieldCount; i++)
    {
        var colName = reader.GetName(i);
        var camelCaseName = Char.ToLowerInvariant(colName[0]) + colName.Substring(1);
        cols.Add(camelCaseName);
    }

    while (reader.Read())
        results.Add(SerializeRow(cols, reader));

    return results;
}

private static Dictionary<string, object> SerializeRow(IEnumerable<string> cols, SqlDataReader reader)
{
    var result = new Dictionary<string, object>();
    foreach (var col in cols)
        result.Add(col, reader[col]);
    return result;
}

 

3. In run.csx, paste the following code. This queries the Azure SQL database and returns the data.

#r "Newtonsoft.Json"
#load "serialize.csx"

using System.Net;

using Microsoft.AspNetCore.Mvc;

using Microsoft.Extensions.Primitives;

using Newtonsoft.Json;

using System.Text;

using System.Data;

using System.Linq;

using System.Configuration;

using System.Data.SqlClient;

using System.Collections.Generic;

public static async Task Run(HttpRequest req, ILogger log)

{

log.LogInformation("C# HTTP trigger function processed a request.");

string name = req.Query["name"];

string json =" ";

try

{

var str = Environment.GetEnvironmentVariable("");




using(SqlConnection conn =new SqlConnection(str))

{

using(SqlCommand cmd =new SqlCommand())

{

SqlDataReader dataReader;

cmd.CommandText = "";

cmd.CommandType = CommandType.Text;

cmd.Connection = conn;

conn.Open();

dataReader = cmd.ExecuteReader();

var r = Serialize(dataReader);

json = JsonConvert.SerializeObject(r, Formatting.Indented);

}

}

}

catch(SqlException sqlex)

{

log.LogInformation(sqlex.Message);

log.LogInformation(sqlex.ToString());

returnnew HttpResponseMessage(HttpStatusCode.BadRequest)

{

Content = new StringContent(JsonConvert.SerializeObject($"The following SqlException happened: {sqlex.Message}"), Encoding.UTF8, "application/json")

};

}

catch(Exception ex)

{

log.LogInformation(ex.Message);

log.LogInformation(ex.ToString());

returnnew HttpResponseMessage(HttpStatusCode.BadRequest)

{

Content = new StringContent(JsonConvert.SerializeObject($"The following SqlException happened: {ex.Message}"), Encoding.UTF8, "application/json")

};

}

returnnew HttpResponseMessage(HttpStatusCode.OK)

{

Content = new StringContent(json, Encoding.UTF8, "application/json")

};

}
This shows how you can easily query Azure SQL Database and return the data as JSON in a few minutes.
Perhaps the next step is to deploy this Azure Function behind API Management.

 

SAP – How to generate an XSD/IDoc Schema

Below are the steps on how to generate an XSD Schema / IDoc Schema from SAP.

1. In the SAP System in which the IDoc is defined, call transaction WE60 (IDoc Documentation).

2. Enter the name of the IDoc in the Basic type field (or the Enhancement field if your IDoc is an Enhancement of a standard SAP IDoc).

3. Choose menu Documentation, XML Schema.

4. Respond Yes, if prompted to ‘Generate Documentation for Unicode File’.

5. On the resulting screen containing the XSD of the IDoc, choose menu XML, Download.

SSIS 2012 to SSIS 2016 Migration – Arithmetic operation resulted in an overflow

We recently upgraded from SSIS 2012 to SSIS 2016; the package upgrade went without issue. In Visual Studio there was no problem running the packages; however, when we used the SSIS runtime we encountered the following error:

“Arithmetic operation resulted in an overflow.”. Possible failure reasons: Problems with the query, “ResultSet” property not set correctly, parameters not set correctly, or connection not established correctly.

The Execute method on the task returned error code 0x80131516 (Arithmetic operation resulted in an overflow.). The Execute method must succeed, and indicate the result using an “out” parameter.

To solve this issue, I recreated the Execute SQL Task and, voila, no error!

 

 

SSIS Tips, Tricks & Best Practices

I've been developing SSIS packages for quite some time now, and below are some tips and tricks I've used to make solutions more consistent, fast and efficient.

1. Shape names (transform shapes): use a prefix; I use a three-letter acronym. For example, name an Execute SQL Task something like EST Update Status.

2. Use Sequence Containers to group related tasks/shapes.

3. Variables 

3.1 Limit the scope; create variables as close as possible to the shapes that use them.

3.2 Use Pascal casing for variables that will be configurable in the dtsConfig and camel casing for the rest. This will make your life easier: by the time you're configuring the dtsConfig, it will be easy to identify which ones are to be included.

3.3 Variable access – instead of manually adding ReadOnlyVariables and ReadWriteVariables on the Script Task, you can use the helpers below to get and set variable values without running into the nasty "variable is locked" error.

' Locks the variable long enough to read its value, then unlocks it again
Private Function GetValue(ByVal variableName As String) As Object
    Dim var As Variables
    Dim objVal As Object
    Dim varName As String = "User::" & variableName
    Dts.VariableDispenser.LockForRead(varName)
    Dts.VariableDispenser.GetVariables(var)
    objVal = var(varName).Value
    var.Unlock()
    GetValue = objVal
End Function

' Locks the variable long enough to write the new value, then unlocks it again
Private Sub SetValue(ByVal variableName As String, ByVal value As Object)
    Dim var As Variables
    Dim varName As String = "User::" & variableName
    Dts.VariableDispenser.LockForWrite(varName)
    Dts.VariableDispenser.GetVariables(var)
    var(varName).Value = value
    var.Unlock()
End Sub
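These two helpers are meant to be pasted inside the Script Task's ScriptMain class. Inside Main, usage could look roughly like this (CurrentStatus is just a hypothetical package variable name):

Public Sub Main()
    ' Read a package variable, transform it and write it back without declaring it
    ' in ReadOnlyVariables/ReadWriteVariables
    Dim status As String = CStr(GetValue("CurrentStatus"))
    SetValue("CurrentStatus", status.ToUpper())
    Dts.TaskResult = ScriptResults.Success
End Sub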

4. Error handling – use precedence constraints to handle failure instead of using event handlers. The reason for this is that it's simpler, cleaner and easier to spot. 🙂 Also, event handlers will sometimes give you weird, cryptic errors that are both a waste of time and unnerving.

5. Package properties – set the ProtectionLevel to DontSaveSensitive and SuppressConfigurationWarning to True.

Setting the protection level will save you from:

Failed to decrypt protected XML node “DTS:Password” with error 0x8009000B

This assumes that you are using Integrated Security or are going to use a SQL configuration table (see item 8).

SuppressConfigurationWarning does just what the name indicates: it suppresses configuration warnings, which may appear because the location of the dtsConfig in production is different from your virtual or dev machine.

6. Bulk update – you might be tempted to use an OLE DB Command to perform an update, but this is the slowest possible way, since the command will be executed once for every row.

A faster way is to create a temporary table on the target database, populate that table, and then use an Execute SQL Task that performs the update (see the sketch after the query below). Aside from the normal db_datareader/db_datawriter permissions, you'll need the db_ddladmin permission to drop and create tables.

UPDATE d
SET d.Column1 = t.Column1,
    d.Column2 = t.Column2
FROM DestinationTable d
INNER JOIN TmpTable t ON (d.[Key] = t.[Key])
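Putting the whole pattern together, a rough sketch of the three steps could look like this (table and column names are placeholders):

-- 1. Execute SQL Task: create the staging table on the target database (requires db_ddladmin)
CREATE TABLE dbo.TmpTable
(
    [Key]   INT NOT NULL,
    Column1 NVARCHAR(100) NULL,
    Column2 NVARCHAR(100) NULL
);

-- 2. Data Flow Task: bulk-load the changed rows into dbo.TmpTable
--    (OLE DB Destination with fast load instead of an OLE DB Command)

-- 3. Execute SQL Task: one set-based update, then clean up
UPDATE d
SET d.Column1 = t.Column1,
    d.Column2 = t.Column2
FROM DestinationTable d
INNER JOIN dbo.TmpTable t ON d.[Key] = t.[Key];

DROP TABLE dbo.TmpTable;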

7. For parallel data processing you can use the Balanced Data Distributor for SSIS; this component can be downloaded separately from Microsoft.

For the quick demo click here.

To download click here

8. As the number of SSIS projects grows over time, especially when connecting to different sources, maintaining the connection strings becomes hard if they are not centralized. To implement centralized configuration for SSIS, see this article: https://randypaulo.wordpress.com/2011/12/02/ssis-centralize-connection-repository/

Lastly, make sure that the connecting lines between tasks are straight :) it just looks more professional and nicer that way.

Happy coding..