AFAS is a Dutch business software company, providing companies with ERP and HR software as a service. You can extract data from the service using a REST API (there are also SOAP web services available for those who appreciate the trip down memory lane). In this blog post I’ll show you how you can get data out of AFAS using Azure Data Factory (ADF). Luckily, the process is quite straightforward, in contrast with the ExactOnline API.
To expose the data, you need to define a GetConnector. In such a GetConnector, you define a set of (typically related) fields and optional filters on top of the data model. AFAS will then create a REST endpoint from which you can extract the data of the GetConnector. Creating GetConnectors is not the focus of this blog post; I assume they’re already created for your environment. After you’ve created one or more GetConnectors, you need to add them to an App Connector. In the App Connector, you create a token for authentication.
The token is used to authenticate against the REST API. You can store it in a secure location, such as Azure Key Vault. It needs to be in a specific XML format (probably for backwards compatibility with the old SOAP web services). Suppose your token is “123”. Then the full token should look like this:
<token><version>1</version><data>123</data></token>
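If you want to construct this value in code, for example before storing it in Key Vault, a minimal Python sketch — using the placeholder token “123” from above — looks like this:

# Wrap the raw App Connector token in the XML envelope AFAS expects.
raw_token = "123"  # placeholder; use the token generated in your App Connector
afas_token = f"<token><version>1</version><data>{raw_token}</data></token>"
print(afas_token)  # <token><version>1</version><data>123</data></token>

It’s this full XML string, not just the raw value, that you store in Key Vault; later in the pipeline it will be base64-encoded into the authorization header.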
Let’s start by creating a linked service in ADF. Search for the REST linked service.
Give the new linked service a name and specify https://xxx.rest.afas.online/profitrestservices/connectors/ as the base URL, where xxx is the member ID of your company in AFAS. Set the authentication type to Anonymous.
Our next step is to create a dataset. Add a REST dataset, give it a name and link it to the linked service we just created.
The base URL is provided by the linked service. The Relative URL specifies which GetConnector we want to connect to. You could hardcode a value, but then you would need to create a new dataset for every GetConnector. We’d rather parameterize it, so we can build a metadata-driven pipeline, as explained in the blog post Dynamic Datasets in Azure Data Factory. So let’s create a parameter for the Relative URL:
The idea is to have metadata available that will list all of the GetConnectors and their respective Relative URLs. In the final pipeline, we will loop over this metadata and use it to fetch all of the data from the GetConnectors in parallel using a ForEach activity.
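To make that concrete, here is a small Python sketch of what such metadata could look like and what the loop conceptually does. The connector names and column names are purely illustrative; in the actual pipeline the metadata lives in a repository such as an Azure SQL table.

# Illustrative metadata: one row per GetConnector, with the Relative URL
# that will be passed to the parameterized dataset.
metadata = [
    {"connector": "Employees", "relative_url": "Employees?skip=-1&take=-1"},
    {"connector": "Invoices", "relative_url": "Invoices?skip=-1&take=-1"},
]

# Conceptually, the ForEach activity does the equivalent of this loop,
# handing each relative URL to the Copy Data activity (in parallel).
for row in metadata:
    print(f"Copying {row['connector']} using relative URL {row['relative_url']}")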
For the Relative URL, we can specify an expression:
But what does such a Relative URL look like? You can construct one using the AFAS developer portal (which requires you to register). To test a GetConnector, we need to log in first by specifying the AFAS member number and the token.
Select a GetConnector from the list to construct the URL.
The skip and take URL parameters can be used to fetch only a part of the data. If you use -1 for both, you’ll get the entire data set. This is not recommended for large volumes of data, as it might lead to time-outs or other errors. For large data sets, you might want to implement your own pagination using the skip and take parameters (don’t forget to sort the data as well to get consistent results).
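If you do need to paginate yourself, the logic looks roughly like the Python sketch below. It calls the REST endpoint directly with the requests library, using the same authorization header we’ll configure in ADF later in this post. The member ID, the token, the response shape and the exact semantics of skip are assumptions here, so verify them against the AFAS documentation (and don’t forget to add sorting).

import base64
import requests

BASE_URL = "https://xxx.rest.afas.online/profitrestservices/connectors/"  # xxx = your member ID
TOKEN_XML = "<token><version>1</version><data>123</data></token>"  # placeholder token
HEADERS = {"Authorization": "AfasToken " + base64.b64encode(TOKEN_XML.encode()).decode()}

def fetch_all(connector: str, page_size: int = 1000) -> list:
    """Page through a GetConnector with skip/take instead of skip=-1&take=-1."""
    rows, skip = [], 0
    while True:
        response = requests.get(
            f"{BASE_URL}{connector}",
            params={"skip": skip, "take": page_size},  # check the docs: verify whether skip counts rows or pages
            headers=HEADERS,
        )
        response.raise_for_status()
        page = response.json().get("rows", [])  # records are assumed to be in a "rows" array
        rows.extend(page)
        if len(page) < page_size:  # last (partial or empty) page reached
            return rows
        skip += page_size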
For smaller sets of data, our Relative URL will look like this: GetConnectorName?skip=-1&take=-1. For larger sets, you can either paginate as mentioned before, or you can filter the data. Specifying a filter is a bit special in AFAS, so it’s recommended to use the developer portal to create a filter on your GetConnector, just to get the syntax right. Let’s illustrate with an example. Our GetConnector has a field “Datum” on which we want to filter. It’s a timestamp, and we want to filter between two dates. In SQL, we would write this in a WHERE clause:
Datum >= '2022-01-01T00:00:00' AND Datum < '2022-02-01T00:00:00'
Let’s enter this info in the dev portal:
The URL parameter part (everything after the question mark) will now look like this:
?filterfieldids=Datum%2CDatum&filtervalues=2022-01-01T00%3A00%3A00%2C2022-02-01T00%3A00%3A00&operatortypes=2%2C5&skip=-1&take=-1
Let’s break down the query string:
- filterfieldids lists the fields you want to filter on, separated by commas (URL-encoded as %2C). Because we have two conditions on the same field, Datum appears twice.
- filtervalues lists the corresponding filter values in the same order; the colons in the timestamps are URL-encoded as %3A.
- operatortypes lists the comparison operator for each condition, again in the same order.
- skip and take are set to -1 to fetch the entire (filtered) result set.
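If you ever need to build such a query string in code instead of copying it from the developer portal, a small Python sketch — reproducing the example above for the placeholder GetConnectorName — could look like this:

from urllib.parse import urlencode

# Filter: Datum >= 2022-01-01 AND Datum < 2022-02-01.
# The field is repeated once per condition; values and operator types line up by position.
params = {
    "filterfieldids": "Datum,Datum",
    "filtervalues": "2022-01-01T00:00:00,2022-02-01T00:00:00",
    "operatortypes": "2,5",  # 2 = greater than or equal to, 5 = smaller than (in this example)
    "skip": -1,
    "take": -1,
}
relative_url = "GetConnectorName?" + urlencode(params)
print(relative_url)
# GetConnectorName?filterfieldids=Datum%2CDatum&filtervalues=2022-01-01T00%3A00%3A00%2C2022-02-01T00%3A00%3A00&operatortypes=2%2C5&skip=-1&take=-1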
Regarding the operators: in this example, operator type 2 corresponds to “greater than or equal to” and operator type 5 to “smaller than”.
A full list can be found in the documentation. The final pipeline will look like this:
The Lookup activity fetches the metadata from a repository (I opted for a table in Azure SQL DB). This repository contains the different GetConnectors and their respective Relative URLs. The Web activity calls the Azure Key Vault API to retrieve the AFAS token. The ForEach activity loops over the GetConnectors:
Inside the loop, there’s a Copy Data activity that fetches the data from the current GetConnector and dumps it into a sink. The source config looks like this:
We map the Relative URL from the metadata to the dataset parameter. We also need to specify an authorization header. This is the AFAS token we retrieved from Azure Key Vault. However, AFAS expects this to be in a base64 format, prefixed with the text “AfasToken ” (don’t forget the space). The full expression:
@concat('AfasToken ',base64(activity('Get Token').output.value))
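If you want to double-check what this expression produces, the Python equivalent — again with the placeholder token — is:

import base64

# The full XML token, as retrieved from Azure Key Vault.
token_xml = "<token><version>1</version><data>123</data></token>"

# Same result as the ADF expression: base64-encode the token and prefix "AfasToken ".
print("AfasToken " + base64.b64encode(token_xml.encode()).decode())
# AfasToken PHRva2VuPjx2ZXJzaW9uPjE8L3ZlcnNpb24+PGRhdGE+MTIzPC9kYXRhPjwvdG9rZW4+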
In the sink you can choose whatever destination you want, as long as it’s compatible with the JSON from the source. You can, for example, write it to a JSON file in Azure Blob Storage or Azure Data Lake Storage, or you can flatten the data and write it to a table. Everything in the sink can be parameterized as well:
If you want to flatten the data, ADF expects a mapping of the JSON columns to the tabular columns. Since everything is metadata-driven, we don’t want to specify an explicit mapping. The blog post Dynamically Map JSON to SQL in Azure Data Factory explains in detail how you can set this up.
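To illustrate what “flattening” means conceptually, here is a small Python sketch. The response shape (a “rows” array of flat records) and the field names are assumptions based on the JSON preview in the developer portal; in the pipeline itself, ADF handles this through the mapping mentioned above.

import csv

# Assumed shape of a GetConnector response: flat records under "rows".
payload = {
    "rows": [
        {"Datum": "2022-01-05T00:00:00", "Bedrag": 100.0},  # illustrative fields
        {"Datum": "2022-01-12T00:00:00", "Bedrag": 250.0},
    ],
}

# Flattening simply means mapping each JSON field to a column.
with open("getconnector_output.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=payload["rows"][0].keys())
    writer.writeheader()
    writer.writerows(payload["rows"])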
That’s it. When you run the pipeline, ADF will load all the GetConnectors in parallel.
Comments
Thank you very much, your information helped us get almost all the way to loading data from AFAS into an Azure SQL Database. Unfortunately the GetConnector is not returning any data (zero results), while I am not getting an authentication error, and the metadata of the destination table is a copy of the GetConnector data.
Do you get data when you do a preview in the copy data activity (in the source config)?
You can also test your getconnector API endpoint here: https://connect.afas.nl/Account/Login?ReturnUrl=%2Frest-json%2Fget-connector
Also make sure there's no filter defined in the getconnector itself. If you go to the definition, do you get data returned in the preview?
Thank you for your reply!
The data from several endpoints was tested at connect.afas.nl. In all cases I was able to check the JSON and tabular content. In one of the tested GetConnectors I used filters. For testing I also removed the skip and take statement. The connection with https://******.rest.afas.online/ProfitRestServices/connectors works fine.
Preview data is clickable in the source config, but it is asking for a value for "@activity('Get Token').output.value". I presume that value must come from the 'Get Token' activity.
If you know the full token, you should be able to enter it at the preview so you can actually fetch a data set.
Thank you, entering the full token unlocks the preview. There also seemed to be an authorisation issue in AFAS. That problem is solved now as well. Strangely enough, the result of the pipeline is still zero, without an error.
But does the preview return data?
Hi Koen,
I think I'm missing some ADF knowledge. I followed all the steps and completed the copy task. I'm a little bit lost at the part with the Lookup, Web and ForEach. I see that I have created a Metadatacopytask_top level, middlelevel and bottomlevel. Where do I need to add the Lookup, Web and ForEach? In the top level, for example, there is already a ForEach.
Thanks in advance!
You have a lookup that fetches the list of getconnectors you want to extract.
The web activity fetches the access token from a secure location.
The foreach will loop over the list fetched by the lookup. Inside the foreach, there's a copy data activity that downloads the data for a specific connector, using the token from the web activity.