Syncing SharePoint Artifacts using Templates
This article describes how you can sync the existing Knowledge Documents in your SharePoint site with Luma Knowledge using Knowledge Templates.
The Federated Service Sync with SharePoint creates the artifacts in Luma Knowledge based on the documents using your organization-specific template. With recurring sync, the artifacts are automatically updated and created based on the updates on the SharePoint portal.
The feature is only available for tenants that are available on Serviceaide supported environments.
The Knowledge Template allows the system to produce metadata and FAQs directly from the entire document, eliminating the need for the QnA Maker service.
To configure the sync with SharePoint, you should follow the below steps:
- Step 1: Create an Application to connect to your SharePoint site
- Step 2: Configure Source System - Data Source for SharePoint
- Step 3: Configure Destination System - Data Source for your Luma Knowledge Tenant
- Step 4: Configure the Object and Data Profile to Export data from Source System - SharePoint
- Step 5: Configure the Object and Data Profile to Import data into the Destination System - Luma Knowledge
- Step 6: Configure the Synchronization Job between SharePoint and Luma Knowledge
- Step 7: Test and Execute Synchronization
Step 1: Create an Application to connect to your SharePoint site
The first step is to create an App principal with access to your SharePoint portal. The App lets Luma Knowledge connect to SharePoint and download documents for artifact creation and updates.
To create the application, follow the below steps:
Navigate to your SharePoint site. For example, https://xxxxxxxxx.sharepoint.com/sites/<SiteName>.
Open the App registration page. Append /_layouts/15/appregnew.aspx to the site address. The URL should look like:
https://xxxxxxxxx.sharepoint.com/sites/<SiteName>/_layouts/15/appregnew.aspx
The App registration page is now open.
The following details are required to create the App:
Client ID: This is the SharePoint App ID. The system autogenerates this. Click Generate.
Client Secret: This is the Password for the App to connect to SharePoint. The system autogenerates this. Click Generate.
Title: Provide a user-friendly display name for the App.
App Domain: This is the Remote server host of the App. Use www.localhost.com if you don't have one.
Redirect URL: This is the Remote application endpoint. Use https://www.localhost.com if you don't have one.
Click Create to create the App.
Your app is now successfully created. Copy the Client ID and Client Secret and retain them in a notepad or text editor. These details are required to set up the SharePoint Data Source.
The next step is to grant the app permission to your SharePoint site. To do so, follow the below steps:
To set permissions for the app, append _layouts/15/appinv.aspx to the site address. The URL should look like:
https://xxxxxxxxx.sharepoint.com/sites/<SiteName>/_layouts/15/appinv.aspx
This opens a new page.
Paste the Client Id (retained in the earlier step) in the App Id field and click Lookup.
This fetches and displays the details of the App Id created earlier.
Provide the Permission Request XML specifying what access the App has. The XML structure is below.

```xml
<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection" Right="FullControl" />
</AppPermissionRequests>
```
You may replace the Scope with one of the following values. Use these URIs exactly as shown; do not modify them.
TENANT = http://sharepoint/content/tenant
SITE COLLECTION = http://sharepoint/content/sitecollection
SUB SITE = http://sharepoint/content/sitecollection/web
LIST/LIBRARY = http://sharepoint/content/sitecollection/web/list
'Right' may also be set to one of the following permission levels. Use these values exactly as shown; do not modify them.
Read = only read access
Write = add/edit/delete
FullControl = full permissions
Note: Based on the Scope selected, the App may require approval from the Azure Global Administrator.
Click Create.
On the confirmation screen, click on Trust It.
Your App is ready to be used.
Now navigate to the Site Settings → Site App Permissions page to fetch the App Identifier for your App. Append /_layouts/15/appprincipals.aspx?Scope=Web to the site address. The URL should look like:
https://xxxxxxxxxx.sharepoint.com/sites/<SiteName>/_layouts/15/appprincipals.aspx?Scope=Web
Copy the App Identifier and retain it in a notepad or text editor. The identifier is used in the Data Source configuration in the next step.
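For background, SharePoint app-only authentication exchanges the Client ID and Client Secret registered above for an access token via the Azure ACS client-credentials grant. Luma Knowledge performs this exchange for you once the Data Source is configured, so the sketch below is purely illustrative; every ID, secret, and domain in it is a placeholder.

```javascript
// Illustrative sketch of the ACS app-only token request that the registered
// App principal enables. All values below are placeholders.
const tenantId = "00000000-0000-0000-0000-000000000000";   // SharePoint Tenant Id
const clientId = "11111111-1111-1111-1111-111111111111";   // Client ID from appregnew.aspx
const clientSecret = "<client-secret>";                    // Client Secret from appregnew.aspx
const domain = "xxxxxxxxx.sharepoint.com";                 // your SharePoint domain

// SharePoint's well-known resource principal ID
const SHAREPOINT_PRINCIPAL = "00000003-0000-0ff1-ce00-000000000000";

// Token endpoint and form body for the client-credentials grant
const tokenUrl =
  `https://accounts.accesscontrol.windows.net/${tenantId}/tokens/OAuth/2`;
const body = new URLSearchParams({
  grant_type: "client_credentials",
  client_id: `${clientId}@${tenantId}`,
  client_secret: clientSecret,
  resource: `${SHAREPOINT_PRINCIPAL}/${domain}@${tenantId}`,
}).toString();

console.log(tokenUrl);
console.log(body);
```

The token returned by this call is what authorizes the document downloads during sync.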
Step 2: Configure Source System - Data Source for SharePoint
Now that the App is ready, the next step is configuring SharePoint in Luma Knowledge. Here, we set up the connection type and connection details for the SharePoint site you want to connect to.
To do so, follow the below steps:
On Luma Knowledge, navigate to Federated Service.
On the Manage Data Sources tab, click on Add Data Source.
Add new Data source details and click on Save.
Once the Data Source is created, click on the new data source to configure the Authorization Profile.
On Configure Authorization Profile page, add the following details:
Select SHAREPOINT as Connection Type.
In the SharePoint Details section, add your SharePoint Domain.
Click on Get Tenant Id From SharePoint App. This automatically populates the Identified SharePoint Tenant Id and Resolved Resource Client Id from SharePoint.
Now configure the SharePoint App details created in Step 1.
Add the App Identifier in Client ID.
Add Client Secret.
Once the details are added, click on Save.
The SharePoint Data Source is now ready.
Step 3: Configure Destination System - Data Source for your Luma Knowledge Tenant
Once the SharePoint Data Source is ready, the next step is configuring the Data Source for your Luma Knowledge instance.
The Data Source for your Luma Knowledge instance is available out-of-the-box as 'LUMA KMS Current Host'. You should provide authentication details to connect to your tenant.
To do so, follow the below steps:
Select LUMA KMS Current Host Data Source on the Manage Data Source tab.
On the Configure Authorization Profile page, click Edit.
Scroll down to the Separate HTTP Authentication Call section.
Update the URL to connect to your Luma Knowledge environment. e.g. https://lumaproedge.serviceaide.com/gateway/api/auth/login.
Set HTTP Method to POST.
In Request Headers tab, add the following Keys:
| Name | Value | Data Type |
| --- | --- | --- |
| Content-Type | application/json | STRING |
| ClientType | Ajax | STRING |
| x-requested-with | XMLHttpRequest | STRING |
| accept | application/json | STRING |

Provide your Tenant authentication details on the Body tab.
Click on Add to add the new key value.
Add the key NAME, e.g. JSON_BODY.
In Value, provide the tenant and authentication details in the below JSON format.
{"username":"<Login username>","password":"<password>","subdomain":"<tenant name>"}
e.g. {"username":"Test","password":"XXXXXXX","subdomain":"Test"}
Select JSON as the Data Type.
Set Response Type to JSON.
Under Extract Response Using, set the JSON Path to $.token.
Click on Save to update the configuration.
Note: Since Luma Knowledge (as Data source) is available out-of-the-box, a few of the above details are available by default. Please review the data and update as required.
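As a sketch, the Separate HTTP Authentication Call configured above behaves like the following. The URL, credentials, and response below are illustrative placeholders; `extractToken` simply mirrors what the $.token JSON path extracts from the login response.

```javascript
// Illustrative sketch of the authentication call; values are placeholders.
const loginUrl = "https://lumaproedge.serviceaide.com/gateway/api/auth/login";
const headers = {
  "Content-Type": "application/json",
  "ClientType": "Ajax",
  "x-requested-with": "XMLHttpRequest",
  "accept": "application/json",
};

// JSON_BODY value provided on the Body tab
const jsonBody = JSON.stringify({
  username: "Test",
  password: "XXXXXXX",
  subdomain: "Test",
});

// "Extract Response Using" with JSON path $.token reads the top-level
// "token" field of the login response:
function extractToken(responseText) {
  return JSON.parse(responseText).token; // equivalent of JSON path $.token
}

// Example with a hypothetical successful login response:
console.log(extractToken('{"token":"abc123","expiresIn":3600}')); // → abc123
```

The extracted token is then attached by the Federated Service to subsequent calls against your Luma Knowledge tenant.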
Step 4: Configure the Object and Data Profile to Export data from Source System - SharePoint
Once the Data Source is ready, the next step is to create a Data Object. This represents the data or entities to be imported from SharePoint, that is, the actual endpoint from which the documents are downloaded.
To configure Data Object, follow the below steps:
On the Manage Data Objects tab, select the newly created Data Source in the Source Name field.
Click on Add Data Object.
Add the following details:
Provide Data Object Name.
Select SHAREPOINT as Connection Type.
Select the SharePoint Data Source Name.
Click Save.
Now select the new Data Object from the list and click Configure Data Profile. Here we provide details on the data/documents to be imported from the source. Follow the below steps to configure the Data Profile.
On the Configure Data Profile page, select the Connection Type as SHAREPOINT.
Under the SharePoint Details section, add the following details:
Add Data Object Name.
Provide your SharePoint site name as Site Name.
Next, add the Folders to be synced. Click on Add to configure one or more folders.
Once the configuration is done, click on Save to complete the profile.
Once the Data Profile is ready, the next step is to configure Object Fields. These fields represent the information to be imported from the source SharePoint site.
Click on Create Data Object Fields to view the list of Object Fields for the Data Source. Based on the Connection Type, default Object Fields are added automatically.
You may add more Object Fields as required. To add an Object Field, follow the below steps:
Click on Add Object Fields.
Add Object Name and Data Type.
Click on Save.
The new Object Field is added to the list.
Step 5: Configure the Object and Data Profile to Import data into the Destination System - Luma Knowledge
The next step is to create the Data Object to import Documents into Luma Knowledge.
The Artifact import Data Object is also available out-of-the-box. You may also create a custom Data Object based on your organization's needs. Verify the OOTB Data Object configuration and update it as required.
Follow the steps to complete the configuration:
On the Manage Data Objects tab, click on Add Data Object.
Add the Data Object Name and Connection details.
The new Data Object is added to the list.
Select the new Data Object to configure the Data Profile.
On Configure Data Profile page, add the following details:
Set Connection Type to REST.
Add the Bulk Upload API URL for your Luma Knowledge instance.
e.g. https://lumaproedge.serviceaide.com/gateway/api/v1/bulk/artifacts/upload
Set HTTP Method to POST.
Add the following to Request Headers.
| Name | Value | Data Type |
| --- | --- | --- |
| Content-Type | multipart/form-data | STRING |
| api-access-key | <API Access Key> | STRING |
| x-requested-with | XMLHttpRequest | STRING |
API Access Key is generated on the tenant details screen. For more information, refer to Tenant Details.
Select JSON as the Response Type.
Under Extract Response Using, enter '$' as the JSON Path. This enables the system to read the complete response.
Set Is Paged Response to True and Paging Strategy to POST_PAYLOAD.
Click on Save to complete the configuration.
Now click on Create Data Object Fields to view the list of Object Fields for the Data Source. Based on the Connection Type, default Object Fields are added automatically. Review the list and add Object Fields as required.
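As a sketch, the import call configured in this step looks like the following. The URL and API access key are placeholders, and `extractResponse` mirrors the '$' JSON path, which keeps the entire response document.

```javascript
// Illustrative sketch of the bulk-upload call configured in the Data Profile.
// The URL and API access key below are placeholders.
const uploadUrl =
  "https://lumaproedge.serviceaide.com/gateway/api/v1/bulk/artifacts/upload";
const headers = {
  "Content-Type": "multipart/form-data",
  "api-access-key": "<API Access Key>",   // generated on the Tenant Details screen
  "x-requested-with": "XMLHttpRequest",
};

// With "Extract Response Using" set to $, the whole JSON response is kept:
function extractResponse(responseText) {
  return JSON.parse(responseText); // JSON path $ = the entire document
}

// Example with a hypothetical upload response:
const resp = extractResponse('{"uploaded":1,"failed":0}');
console.log(resp.uploaded); // → 1
```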
Step 6: Configure the Synchronization Job between SharePoint and Luma Knowledge
Once the Data Sources and Data Objects are configured, the next step is configuring the sync between your SharePoint site and Luma Knowledge. Here, we map the Data Objects of the source and destination systems and configure the frequency at which the data sync runs.
To create a Synchronization pair, follow the below steps:
Navigate to the Manage Synchronization tab.
Click on the Add Sync Object button.
On the Configure Synchronization screen, add the following details:
Add Sync Job Name.
Add Description.
Set ‘Enable’ to False. Once the sync job is configured and tested, we can enable the job.
Set 'Log Level' to Debug. This ensures that execution logs are available for each sync run. The logs can be used to troubleshoot any errors in the sync.
In Push Data To (Data Destination), select the Luma Knowledge Data Source and Data Object information.
In Fetch Data From (Data Source), select the SharePoint Data Source and Data Object information.
Map Fields: Map the Data Object Fields in SharePoint to the Data Object Fields in Luma Knowledge. For each Destination Data Object Field, add the mappings as below:
For the Destination Data Object Field artifactJson, click on Configure to add a Custom Function. The function enables the system to read documents from the SharePoint site and create artifacts in Luma Knowledge. You may use the following default function or create a new one to match your organization's requirements and mappings.
Note: Below is a sample function. Update the site URLs as per your SharePoint website.

```javascript
function getSharepointPageContent(model) {
    var token = federatedService.getContextParameter("srcAuthToken");
    var pageId = model.id;
    console.log("Page: " + pageId);
    console.log("Token: " + token);
    headers = {
        "Authorization": "Bearer " + token,
        "ConsistencyLevel": "eventual"
    };
    var pageUrl = "https://graph.microsoft.com/beta/sites/wgsigma.sharepoint.com,5dfb0c63-4e73-49d8-951f-e4c1292c267d,d4fd8b3b-be36-4c77-9287-b57b48c0f8e0/pages/" + pageId + "/microsoft.graph.sitePage?expand=canvasLayout";
    var resp = federatedService.invokeRestAPI(pageUrl, "GET", headers, null);
    var respJson = JSON.parse(resp);
    var column = respJson.canvasLayout.horizontalSections[0].columns;
    var content = "";
    for (let i = 0; i < column.length; i++) {
        var parts = column[i].webparts;
        for (let j = 0; j < parts.length; j++) {
            content += parts[j].innerHtml;
        }
    }
    var d = [{
        "name": "description",
        "type": "richtext",
        "label": "Description",
        "value": model.description,
        "htmlValue": model.description,
        "sortOrder": 1,
        "labelConfig": { "aliases": [] },
        "defaultValue": "",
        "ontologyType": "MAIN",
        "summaryField": false,
        "reconciliationKey": false,
        "usedForRedirectUrl": false
    }, {
        "name": "content",
        "type": "richtext",
        "label": "Content",
        "value": content,
        "hidden": false,
        "required": true,
        "htmlValue": content,
        "sortOrder": 2,
        "labelConfig": { "aliases": [] },
        "defaultValue": "",
        "ontologyType": "MAIN",
        "summaryField": false,
        "reconciliationKey": false,
        "usedForRedirectUrl": false
    }, {
        "name": "link",
        "type": "text",
        "label": "Link",
        "value": "https://wgsigma.sharepoint.com/sites/InfraservKM/" + model.webUrl,
        "hidden": false,
        "required": false,
        "htmlValue": "https://wgsigma.sharepoint.com/sites/InfraservKM/" + model.webUrl,
        "sortOrder": 3,
        "labelConfig": { "aliases": [] },
        "defaultValue": "",
        "ontologyType": "NONE",
        "summaryField": false,
        "reconciliationKey": false,
        "usedForRedirectUrl": true
    }];
    return JSON.stringify(d);
}
```
The above function processes the documents using the Knowledge Template. Luma Knowledge creates an artifact based on the document and generates FAQs from the content. The artifacts are created under the domain 'IT'.
Add the script and update the following fields in the script as required:
Update 'pageUrl' to the Graph URL for your SharePoint site.
Update ‘value’ and ‘htmlValue’ to your website.
Update ‘artifactTemplate’ to your Knowledge Template ID.
Click Save.
Set Enable Validation Rules to Disable.
If you want to enable validation rules, set Enable Validation Rules to Enable. You may add a Custom Subroutine to Validation Rules to validate if the data record should be imported.
Once the details are added, click the Save button.
The Sync Job is now ready.
Step 7: Test and Execute Synchronization
Once the data sync job is configured, you can now enable the sync and execute the job. The data sync can be executed manually or scheduled to run at a defined frequency using a CRON job.
Follow the below steps to execute the sync:
On Manage Synchronization Jobs, select the SharePoint-Luma Knowledge sync job (created in Step 6).
Click Edit to view the updated details.
Set Enabled to True.
To manually execute the job,
Set the Frequency to One Time and click Save.
Now scroll down the page and click Execute to trigger the job.
Alternatively, you may also execute the job from the Manage Synchronization Jobs tab. Select the job and click on Execute Sync Pair.
Upon successful completion of the job, you can view the details on Execution Logs.
On the Execution Logs, you can view the status of each execution.
Select the record to view the details on the documents imported and artifacts created. In the below example, 1 document was downloaded from SharePoint and uploaded into Luma Knowledge.
To schedule the job execution at a specific interval, set Frequency to Recurring and add the CRON expression to schedule the automatic sync run. Once scheduled, the job runs automatically at the defined interval.
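For example, assuming the common Quartz-style six-field CRON syntax (verify the exact format your tenant expects), the following expression would run the sync every day at 02:00:

```
0 0 2 * * ?
```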