App B: Networking |
- App A -> App B: 403 Forbidden. From anywhere else -> App B: 200.
- App A -> App C: 200
Recently I was updating an old .NET Core web application to .NET 8 and the code was reading a certificate as follows.
private X509Certificate2 GetCertificateByThumbprint(string thumbprint)
{
    X509Store store = new(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);

    // validOnly: true returns only certificates that pass validation
    X509Certificate2Collection certificateCollection =
        store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, validOnly: true);

    return certificateCollection.OfType<X509Certificate2>().SingleOrDefault();
}
This piece of code wasn't working once the application was deployed to Azure App Service (Windows). The certificate was set up in App Service, but the code wasn't picking it up. As usual, QA was insisting it used to work.
It turns out I needed to add an app setting WEBSITE_LOAD_CERTIFICATES with a value of comma-separated certificate thumbprints in order for them to be loaded and accessible from App Service code.
{
  "name": "WEBSITE_LOAD_CERTIFICATES",
  "value": "<comma-separated-certificate-thumbprints>",
  "slotSetting": false
}
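If you prefer the Azure CLI over the portal, the same app setting can be added along these lines (a sketch; substitute your own resource group and app name):

```shell
az webapp config appsettings set \
  --resource-group <resource-group-name> \
  --name <app-name> \
  --settings WEBSITE_LOAD_CERTIFICATES="<comma-separated-certificate-thumbprints>"
```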
You can read more in Use a TLS/SSL certificate in your code in Azure App Service, which contains instructions for other scenarios like loading a certificate from a file and loading a certificate in Linux/Windows containers.
Hope this helps.
Happy Coding.
Regards,
Jaliya
In this post, let's see how to call an external API using Client Credentials in an Azure AD B2C User Journey.
I am assuming the Azure AD B2C App Registration is already set up for the client app with the necessary permission (scope access) to call the protected API, and that you have noted down the Client ID, Client Secret, and the Scope.
Note: There are no additional actions needed to enable client credentials; both Azure AD B2C user flows and custom policies support the client credentials flow by default. But of course, you can create a custom policy to customize the user journey for the OAuth 2.0 client credentials grant and extend the token issuance process.
First, you can verify that everything is set up correctly using the following PowerShell script.
$clientId = "<clientId>"
$clientSecret = "<clientSecret>"
$endpoint = "https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token"
$scope = "<scope>"
$body = "grant_type=client_credentials&scope=" + $scope + "&client_id=" + $clientId + "&client_secret=" + $clientSecret
$token = Invoke-RestMethod -Method Post -Uri $endpoint -Body $body
$token | ConvertTo-Json
Here, the scope is something like the following:
$scope = "https://<tenant-name>.onmicrosoft.com/45a2252d-099a-4c6a-9c57-66eac05e2693/.default"
Test Client Credentials |
Now, let's see how to call an external API using these client credentials from within a custom policy user journey.
1. Define a ClaimType for access_token.
<BuildingBlocks>
  <ClaimsSchema>
    ...
    <ClaimType Id="access_token">
      <DisplayName>Access Token</DisplayName>
      <DataType>string</DataType>
    </ClaimType>
    ...
  </ClaimsSchema>
</BuildingBlocks>
2. Define TechnicalProfiles to retrieve access_token and to call the external API using the retrieved access_token.
<ClaimsProvider>
  ...
  <TechnicalProfiles>
    <TechnicalProfile Id="REST-GetClientCredentials">
      <DisplayName>Get Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
      <Metadata>
        <!-- Note: '&' needs to be escaped as '&amp;' inside the policy XML -->
        <Item Key="ServiceUrl">https://<tenant-name>.b2clogin.com/<tenant-name>.onmicrosoft.com/<policy>/oauth2/v2.0/token?grant_type=client_credentials&amp;scope=<scope>&amp;client_id=<clientId>&amp;client_secret=<clientSecret></Item>
        <Item Key="SendClaimsIn">Body</Item>
        <Item Key="AuthenticationType">None</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
      </Metadata>
      <OutputClaims>
        <OutputClaim ClaimTypeReferenceId="access_token"/>
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop"/>
    </TechnicalProfile>
    <TechnicalProfile Id="REST-CallApiUsingClientCredentials">
      <DisplayName>Call an External API using Client Credentials</DisplayName>
      <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
      <Metadata>
        <Item Key="ServiceUrl"><Endpoint to call></Item>
        <Item Key="SendClaimsIn">Header</Item>
        <Item Key="AuthenticationType">Bearer</Item>
        <Item Key="UseClaimAsBearerToken">access_token</Item>
        <Item Key="AllowInsecureAuthInProduction">true</Item>
        <Item Key="IncludeClaimResolvingInClaimsHandling">true</Item>
      </Metadata>
      <InputClaims>
        <InputClaim ClaimTypeReferenceId="access_token"/>
      </InputClaims>
      <OutputClaims>
        <!-- Output Claims from Calling the API -->
      </OutputClaims>
      <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop" />
    </TechnicalProfile>
    ...
  </TechnicalProfiles>
</ClaimsProvider>
3. Finally, introduce additional OrchestrationSteps to your UserJourney to use the above TechnicalProfiles.
<UserJourneys>
  <UserJourney Id="<UserJourneyId>">
    <OrchestrationSteps>
      ...
      <OrchestrationStep Order="7" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTGetClientCredentials" TechnicalProfileReferenceId="REST-GetClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      <OrchestrationStep Order="8" Type="ClaimsExchange">
        <ClaimsExchanges>
          <ClaimsExchange Id="RESTCallApiUsingClientCredentials" TechnicalProfileReferenceId="REST-CallApiUsingClientCredentials" />
        </ClaimsExchanges>
      </OrchestrationStep>
      ...
      <OrchestrationStep Order="11" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
    </OrchestrationSteps>
  </UserJourney>
</UserJourneys>
Now that should be it.
Hope this helps.
Happy Coding.
using Polly;
using System.Diagnostics;
using System.Runtime.InteropServices;

namespace HelloAzureFunctions.Tests.Integration.Fixtures;

public class AzureFunctionFixture : IDisposable
{
    private readonly string _path = Directory.GetCurrentDirectory();
    private readonly string _testOutputPath = Path.Combine(Directory.GetCurrentDirectory(), "integration-test-output.log");
    private readonly int _port = 7071;
    private readonly string _baseUrl;
    private readonly Process _process;

    public readonly HttpClient HttpClient;

    public AzureFunctionFixture()
    {
        _baseUrl = $"http://localhost:{_port}";
        HttpClient = new HttpClient()
        {
            BaseAddress = new Uri(_baseUrl)
        };

        if (File.Exists(_testOutputPath))
        {
            File.Delete(_testOutputPath);
        }

        DirectoryInfo directoryInfo = new(_path);
        _process = StartProcess(_port, directoryInfo);
        _process.OutputDataReceived += (sender, args) =>
        {
            // args.Data is null when the output stream is closed
            if (args.Data is not null)
            {
                File.AppendAllLines(_testOutputPath, [args.Data]);
            }
        };
        _process.BeginOutputReadLine();
    }

    public void Dispose()
    {
        if (!_process.HasExited)
        {
            _process.Kill(entireProcessTree: true);
        }

        _process.Dispose();
        HttpClient.Dispose();
    }

    public async Task WaitUntilFunctionsAreRunning()
    {
        PolicyResult<HttpResponseMessage> result =
            await Policy.TimeoutAsync(TimeSpan.FromSeconds(30))
                .WrapAsync(Policy.Handle<Exception>().WaitAndRetryForeverAsync(index => TimeSpan.FromMilliseconds(500)))
                .ExecuteAndCaptureAsync(() => HttpClient.GetAsync(""));

        if (result.Outcome != OutcomeType.Successful)
        {
            throw new InvalidOperationException("The Azure Functions project doesn't seem to be running.");
        }
    }

    private static Process StartProcess(int port, DirectoryInfo workingDirectory)
    {
        string fileName = "func";
        string arguments = $"start --port {port} --verbose";

        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            fileName = "powershell.exe";
            arguments = $"func start --port {port} --verbose";
        }

        ProcessStartInfo processStartInfo = new(fileName, arguments)
        {
            UseShellExecute = false,
            CreateNoWindow = true,
            RedirectStandardOutput = true,
            WorkingDirectory = workingDirectory.FullName,
            EnvironmentVariables =
            {
                // Passing an additional environment variable to the application,
                // so it can control the behavior when running for Integration Tests
                [ApplicationConstants.IsRunningIntegrationTests] = "true"
            }
        };

        Process process = new() { StartInfo = processStartInfo };
        process.Start();

        return process;
    }
}
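For completeness, here is a rough sketch of how a test class might consume this fixture with xUnit. The test class name and the `api/hello-world` route are assumptions for illustration; use one of your own function's routes:

```csharp
using Xunit;

namespace HelloAzureFunctions.Tests.Integration;

// xUnit creates the fixture once, sharing the spun-up Functions host across tests in the class
public class HttpTriggerTests : IClassFixture<AzureFunctionFixture>
{
    private readonly AzureFunctionFixture _fixture;

    public HttpTriggerTests(AzureFunctionFixture fixture)
    {
        _fixture = fixture;
    }

    [Fact]
    public async Task HttpTrigger_Returns_Success()
    {
        // Make sure the Functions host started by the fixture is ready to accept requests
        await _fixture.WaitUntilFunctionsAreRunning();

        // 'api/hello-world' is a hypothetical route
        HttpResponseMessage response = await _fixture.HttpClient.GetAsync("api/hello-world");

        Assert.True(response.IsSuccessStatusCode);
    }
}
```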
name: Run Integration Tests

on:
  push:
    branches: ["main"]
    paths-ignore:
      - '**.md'

env:
  DOTNET_VERSION: '8.0.x'

jobs:
  build-and-test:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    env:
      INTEGRATION_TEST_EXECUTION_DIRECTORY: ./tests/HelloAzureFunctions.Tests.Integration/bin/Debug/net8.0
    steps:
      - name: 'Checkout GitHub Action'
        uses: actions/checkout@v3

      - name: Setup .NET ${{ env.DOTNET_VERSION }} Environment
        uses: actions/setup-dotnet@v3
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}

      - name: Build
        run: dotnet build

      # Install Azure Functions Core Tools in the runner,
      # so we have access to 'func' to spin up the Azure Functions app in integration tests
      - name: Install Azure Functions Core Tools
        run: |
          npm install -g azure-functions-core-tools@4 --unsafe-perm true

      # Set up Azurite in the runner,
      # so the Azure Functions app we are going to spin up can use Azurite as its storage provider
      - name: Setup Azurite
        shell: bash
        run: |
          npm install -g azurite
          azurite --silent &

      - name: Run Integration Tests
        # If there are any errors executing integration tests, uncomment the following line
        # to continue the workflow, so you can look at integration-test-output.log
        # continue-on-error: true
        run: dotnet test ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/HelloAzureFunctions.Tests.Integration.dll

      - name: Upload Integration Tests Execution Log
        uses: actions/upload-artifact@v4
        with:
          name: artifact-${{ matrix.os }}
          path: ${{ env.INTEGRATION_TEST_EXECUTION_DIRECTORY }}/integration-test-output.log
[Function(nameof(ServiceBusTrigger))]
public static void ServiceBusTrigger(
    [ServiceBusTrigger("%Messaging:Topic%", "%Messaging:Subscription%")] ServiceBusReceivedMessage serviceBusReceivedMessage)
{
    // TODO: Process the received message
}
Azure App Configuration Values |
@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=<key>)
// if you want to choose a particular Label
@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=<key>; Label=<label>)
Function App Configuration |
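Putting the two together, the %Messaging:Topic% and %Messaging:Subscription% binding expressions in the trigger above can resolve from app settings whose values are App Configuration references, along these lines (a sketch using the sample endpoint from above; the Key names are assumptions):

```json
[
  {
    "name": "Messaging:Topic",
    "value": "@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=Messaging:Topic)",
    "slotSetting": false
  },
  {
    "name": "Messaging:Subscription",
    "value": "@Microsoft.AppConfiguration(Endpoint=https://aac-temp-001.azconfig.io; Key=Messaging:Subscription)",
    "slotSetting": false
  }
]
```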
C:\vsts-agent\_work\_tool\dotnet\sdk\5.0.405\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.TargetFrameworkInference.targets(141,5):
error NETSDK1045: The current .NET SDK does not support targeting .NET 8.0.
Either target .NET 5.0 or lower, or use a version of the .NET SDK that supports .NET 8.0. [C:\vsts-agent\_work\67\s\xxxxx.csproj]
- task: UseDotNet@2
  displayName: Use .NET
  inputs:
    packageType: 'sdk'
    version: '8.0.x'
{
  ...
  "variables": {
    "DOTNET_MSBUILD_SDK_RESOLVER_SDKS_DIR": {
      "value": "C:\\vsts-agent\\_work\\_tool\\dotnet\\sdk\\5.0.405\\Sdks"
    },
    "DOTNET_MSBUILD_SDK_RESOLVER_SDKS_VER": {
      "value": "5.0.405"
    },
    ...
  }
  ...
}
Passing variables to the pipeline execution |
variables:
  - name: DOTNET_MSBUILD_SDK_RESOLVER_SDKS_DIR
    value: 'C:\vsts-agent\_work\_tool\dotnet\sdk\8.0.101\Sdks'
  - name: DOTNET_MSBUILD_SDK_RESOLVER_SDKS_VER
    value: '8.0.101'
...
I had a requirement where I wanted to do an additional validation on a boolean claim value in an AAD B2C user journey. If the boolean claim value is true, I wanted to move forward in the user journey. If the value is false, I wanted to short circuit the user journey and return an error.
I couldn't use a Validation Technical Profile, because the claim I am validating on was output by a non-self-asserted technical profile (the claim was retrieved by calling an external REST endpoint), and Validation Technical Profiles are only supported in self-asserted technical profiles.
In such cases, we can add an additional OrchestrationStep with a Precondition: if the assertion fails, navigate the user to a self-asserted technical profile and display the error there.
So how do we do that?
1. Define a ClaimType for a self-asserted technical profile.
<BuildingBlocks>
  <ClaimsSchema>
    ...
    <ClaimType Id="errorMessage">
      <DisplayName>Please contact support.</DisplayName>
      <DataType>string</DataType>
      <UserInputType>Paragraph</UserInputType>
    </ClaimType>
  </ClaimsSchema>
  ...
</BuildingBlocks>
2. Define a ClaimsTransformation.
<BuildingBlocks>
  ...
  <ClaimsTransformations>
    ...
    <ClaimsTransformation Id="CreateApplicationUserNotActiveErrorMessage" TransformationMethod="CreateStringClaim">
      <InputParameters>
        <InputParameter Id="value" DataType="string" Value="Application user is not active." />
      </InputParameters>
      <OutputClaims>
        <OutputClaim ClaimTypeReferenceId="errorMessage" TransformationClaimType="createdClaim" />
      </OutputClaims>
    </ClaimsTransformation>
  </ClaimsTransformations>
</BuildingBlocks>
3. Define a self-asserted TechnicalProfile. Use the above ClaimsTransformation as an InputClaimsTransformation, and reference the ClaimType created in the first step.
<ClaimsProviders>
  <ClaimsProvider>
    <DisplayName>...</DisplayName>
    <TechnicalProfiles>
      ...
      <TechnicalProfile Id="SelfAsserted-ApplicationUserNotActiveError">
        <DisplayName>Error message</DisplayName>
        <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.SelfAssertedAttributeProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
        <Metadata>
          <Item Key="ContentDefinitionReferenceId">api.selfasserted</Item>
          <Item Key="setting.showContinueButton">false</Item>
          <Item Key="setting.showCancelButton">true</Item>
        </Metadata>
        <InputClaimsTransformations>
          <InputClaimsTransformation ReferenceId="CreateApplicationUserNotActiveErrorMessage" />
        </InputClaimsTransformations>
        <InputClaims>
          <InputClaim ClaimTypeReferenceId="errorMessage"/>
        </InputClaims>
        <OutputClaims>
          <OutputClaim ClaimTypeReferenceId="errorMessage"/>
        </OutputClaims>
      </TechnicalProfile>
    </TechnicalProfiles>
  </ClaimsProvider>
</ClaimsProviders>
4. Introduce an additional OrchestrationStep with a Precondition before the last OrchestrationStep. If the condition is not satisfied, use the created self-asserted TechnicalProfile.
<UserJourneys>
  ...
  <UserJourney Id="...">
    <OrchestrationSteps>
      ...
      <OrchestrationStep Order="9" Type="ClaimsExchange">
        <Preconditions>
          <!-- The 'isActive' claim is forwarded from a previous step -->
          <Precondition Type="ClaimEquals" ExecuteActionsIf="true">
            <Value>isActive</Value>
            <Value>True</Value>
            <Action>SkipThisOrchestrationStep</Action>
          </Precondition>
        </Preconditions>
        <ClaimsExchanges>
          <ClaimsExchange Id="SelfAssertedApplicationUserNotActiveError" TechnicalProfileReferenceId="SelfAsserted-ApplicationUserNotActiveError" />
        </ClaimsExchanges>
      </OrchestrationStep>
      ...
      <OrchestrationStep Order="11" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
    </OrchestrationSteps>
  </UserJourney>
  ...
</UserJourneys>
Happy Coding.
In this post, let's see how we can preserve Stack<T> order when it's getting passed between Orchestrators/Activities in a .NET Isolated Azure Durable Function.
In Durable Functions in the .NET isolated worker, the default serialization behavior has changed from Newtonsoft.Json to System.Text.Json.
I have already written a post about preserving Stack order in an in-process Azure Durable Function here. I am using the same code example, converted to the isolated worker model, so I am not going to write down the entire example code to describe the issue here; you can have a look at the previous post.
You can see in the below screenshot that the order of Stack<T> is not preserved with the default serializer options.
To fix that, we can register a custom JsonConverter for Stack<T> in Program.cs as follows.
using DurableFunctions.Isolated.StackSerialization.Converters;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using System.Text.Json;

IHost host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .ConfigureServices(services =>
    {
        services.Configure<JsonSerializerOptions>(options =>
        {
            // Add custom converter to serialize and deserialize a Stack<T>
            options.Converters.Add(new JsonConverterFactoryForStackOfT());
        });
    })
    .Build();

host.Run();
Correct Result |
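For reference, the JsonConverterFactoryForStackOfT used above follows the Stack<T> custom-converter pattern from the System.Text.Json documentation. A minimal sketch (the exact code in the project may differ):

```csharp
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;

public class JsonConverterFactoryForStackOfT : JsonConverterFactory
{
    public override bool CanConvert(Type typeToConvert) =>
        typeToConvert.IsGenericType &&
        typeToConvert.GetGenericTypeDefinition() == typeof(Stack<>);

    public override JsonConverter CreateConverter(Type typeToConvert, JsonSerializerOptions options)
    {
        // Create the closed generic converter for the stack's element type
        Type elementType = typeToConvert.GetGenericArguments()[0];
        return (JsonConverter)Activator.CreateInstance(
            typeof(JsonConverterForStackOfT<>).MakeGenericType(elementType))!;
    }
}

public class JsonConverterForStackOfT<T> : JsonConverter<Stack<T>>
{
    public override Stack<T> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        Stack<T> stack = new();
        while (reader.Read() && reader.TokenType != JsonTokenType.EndArray)
        {
            stack.Push(JsonSerializer.Deserialize<T>(ref reader, options)!);
        }
        return stack;
    }

    public override void Write(Utf8JsonWriter writer, Stack<T> stack, JsonSerializerOptions options)
    {
        // Write bottom-to-top, so pushing the items back in that order
        // during Read restores the original stack order
        writer.WriteStartArray();
        foreach (T item in stack.Reverse())
        {
            JsonSerializer.Serialize(writer, item, options);
        }
        writer.WriteEndArray();
    }
}
```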
Hope this helps.
Happy Coding.