Role-based embedded reports in Google Data Studio
We often need to build reports to present our data. Google Data Studio is a powerful tool for building your own reports quickly and without any coding knowledge.
Data Studio reports are built on data sources. A data source is the connection, or method, used to connect a report to our database.
There are multiple types of connectors available for Google data sources, and we can also build our own connector.
The built-in data source providers are listed at https://datastudio.google.com/u/0/datasources/create, where we can choose a data source according to our requirements.
First, let's use the MySQL connector:
Step 1: Enter the database connection details and click Authenticate. If all the details are correct, Data Studio connects to our database.
Step 2: There are two options: a) select a table that holds the data the report needs, or b) write your own query. Then click the Connect button.
Step 3: The columns returned by the table or query are listed, and you can create an explorer report from them.
https://drive.google.com/file/d/17RvzBabIrOdT5CX8yZwdlFAWYbsxxify/view?usp=sharing
These built-in connectors have a limitation. Suppose we have an HR database: the administrator can view every department's data, but a head of department is only allowed to view data for their own department.
In that case, we need to create a custom data source connector.
Step 1: Write the custom connector in Google Apps Script at https://script.google.com/. We have to update two files: the manifest (appsscript.json) and the Code.gs script.
appsscript.json
{ "timeZone": "GMT", "dependencies": { }, "runtimeVersion": "V8", "dataStudio": { "name": "Custom JSON Connect", "logoUrl": "https://www.aurigait.com/resources/themes/ait-child/assets/ait/images/auriga_logo_white.svg", "company": "Auriga It", "companyUrl": "https://aurigait.com/", "addonUrl": "https://aurigait.com/", "supportUrl": "https://aurigait.com/", "description": "Free JSON connector. Fetch data from any JSON data source by URL. Use caching to speed up rendering.", "sources": ["CUSTOM_JSON"], "shortDescription": "Connect to JSON data by URL", "authType": ["NONE"], "feeType": ["FREE"] } }
Code.gs
/** Throws and logs script exceptions. @param {String} message The exception message */
function sendUserError(message) {
  var cc = DataStudioApp.createCommunityConnector();
  cc.newUserError()
    .setText(message)
    .throwException();
}

/** @returns {Object} AuthType used by the connector. */
function getAuthType() {
  return {type: 'NONE'};
}

/**
 * @returns {Boolean} Currently just returns false. Should return true if the current
 * authenticated user at the time of execution is an admin user of the connector.
 */
function isAdminUser() {
  return false;
}

/**
 * Returns the user-configurable options for the connector.
 * Required function for a Community Connector.
 * @param {Object} request Config request parameters.
 * @returns {Object} Connector configuration to be displayed to the user.
 */
function getConfig(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var config = cc.getConfig();

  var option1 = config
    .newOptionBuilder()
    .setLabel('Text')
    .setValue('text');

  var option2 = config
    .newOptionBuilder()
    .setLabel('Inline')
    .setValue('inline');

  config
    .newInfo()
    .setId('instructions')
    .setText('Fill out the form to connect to a JSON data source.');

  config
    .newTextInput()
    .setId('url')
    .setName('Enter the URL of a JSON data source')
    .setHelpText('e.g. https://my-url.org/json')
    .setPlaceholder('https://my-url.org/json');

  // How nested JSON should be imported; read back as configParams.nestedData in getFields().
  config
    .newSelectSingle()
    .setId('nestedData')
    .setName('Nested data')
    .setHelpText('How to import nested data, as text or inline.')
    .setAllowOverride(true)
    .addOption(option1)
    .addOption(option2);

  config
    .newTextInput()
    .setId('reportid')
    .setName('reportid')
    .setAllowOverride(true);

  config
    .newTextInput()
    .setId('department')
    .setName('department')
    .setAllowOverride(true);

  config.setDateRangeRequired(false);
  return config.build();
}

/**
 * Gets the UrlFetch response and parses the JSON. The reportid and department
 * config parameters are forwarded to the URL as query parameters.
 * @param {string} url The URL to get the data from
 * @param {Object} request getSchema/getData request parameters.
 * @returns {Object} The response object
 */
function fetchJSON(url, request) {
  try {
    if (request.configParams.department) {
      url = url + '?reportid=' + request.configParams.reportid +
        '&department=' + request.configParams.department;
    } else {
      url = url + '?reportid=' + request.configParams.reportid;
    }
    var response = UrlFetchApp.fetch(url);
  } catch (e) {
    sendUserError('"' + url + '" returned an error:' + e);
  }
  try {
    var content = JSON.parse(response);
  } catch (e) {
    sendUserError('Invalid JSON format. ' + e);
  }
  return content;
}

/**
 * Fetches the data by calling fetchJSON after validating the URL.
 * @param {String} url The URL to get the data from
 * @param {Object} request getSchema/getData request parameters.
 * @returns {Object} The response object
 */
function fetchData(url, request) {
  if (!url || !url.match(/^https?:\/\/.+$/g)) {
    sendUserError('"' + url + '" is not a valid url.');
  }
  try {
    var content = fetchJSON(url, request);
  } catch (e) {
    sendUserError('Your request could not be processed. ' + e);
  }
  if (!content) sendUserError('"' + url + '" returned no content.');
  return content;
}

/**
 * Matches a field value to a semantic type.
 * @param {Mixed} value The field value
 * @param {Object} types The list of types
 * @return {string} The semantic type
 */
function getSemanticType(value, types) {
  if (!isNaN(parseFloat(value)) && isFinite(value)) {
    return types.NUMBER;
  } else if (value === true || value === false) {
    return types.BOOLEAN;
  } else if (typeof value != 'object' && value != null) {
    if (
      value.match(
        new RegExp(
          /[-a-zA-Z0-9@:%_\+.~#?&//=]{2,256}\.[a-z]{2,4}\b(\/[-a-zA-Z0-9@:%_\+.~#?&//=]*)?/gi
        )
      )
    ) {
      return types.URL;
    } else if (!isNaN(Date.parse(value))) {
      return types.YEAR_MONTH_DAY_HOUR;
    }
  }
  return types.TEXT;
}

/**
 * Creates a field: a metric for numeric values, a dimension otherwise.
 * @param {Object} fields The list of fields
 * @param {Object} types The list of types
 * @param {String} key The key of the current element
 * @param {Mixed} value The value of the current element
 */
function createField(fields, types, key, value) {
  var semanticType = getSemanticType(value, types);
  var field =
    semanticType == types.NUMBER ? fields.newMetric() : fields.newDimension();
  field.setType(semanticType);
  field.setId(key.replace(/\s/g, '_').toLowerCase());
  field.setName(key);
}

/**
 * Builds keys for recursive (nested) fields.
 * @param {String} key The key of the parent element
 * @param {Mixed} currentKey The key of the current element
 * @returns {String} The combined key
 */
function getElementKey(key, currentKey) {
  if (currentKey == '' || currentKey == null) {
    return;
  }
  if (key != null) {
    return key + '.' + currentKey.replace('.', '_');
  }
  return currentKey.replace('.', '_');
}

/**
 * Extracts the object's fields recursively and adds them to fields.
 * @param {Object} fields The list of fields
 * @param {Object} types The list of types
 * @param {String} key The key of the current element
 * @param {Mixed} value The value of the current element
 * @param {boolean} isInline Whether nested data is imported inline
 */
function createFields(fields, types, key, value, isInline) {
  if (typeof value === 'object' && !Array.isArray(value) && value !== null) {
    Object.keys(value).forEach(function(currentKey) {
      var elementKey = getElementKey(key, currentKey);
      if (isInline && value[currentKey] != null) {
        createFields(fields, types, elementKey, value[currentKey], isInline);
      } else {
        createField(fields, types, currentKey, value);
      }
    });
  } else if (key !== null) {
    createField(fields, types, key, value);
  }
}

/**
 * Parses the first row of content to determine the data schema.
 * @param {Object} request getSchema/getData request parameter.
 * @param {Object} content The content object
 * @return {Object} The connector fields
 */
function getFields(request, content) {
  var cc = DataStudioApp.createCommunityConnector();
  var fields = cc.getFields();
  var types = cc.FieldType;
  var isInline = request.configParams.nestedData === 'inline';

  if (!Array.isArray(content)) content = [content];
  if (typeof content[0] !== 'object' || content[0] === null) {
    sendUserError('Invalid JSON format');
  }
  try {
    createFields(fields, types, null, content[0], isInline);
  } catch (e) {
    sendUserError('Unable to identify the data format of one of your fields.');
  }
  return fields;
}

/**
 * Returns the schema for the given request.
 * @param {Object} request Schema request parameters.
 * @returns {Object} Schema for the given request.
 */
function getSchema(request) {
  var content = fetchData(request.configParams.url, request);
  var fields = getFields(request, content).build();
  return {schema: fields};
}

/**
 * Performs a deep merge of objects and returns a new object. Does not modify
 * the input objects (immutable) and merges arrays via concatenation.
 * Thanks to jhildenbiddle https://stackoverflow.com/users/4903063/jhildenbiddle
 * https://stackoverflow.com/questions/27936772/how-to-deep-merge-instead-of-shallow-merge
 * @returns {object} New object with merged key/values
 */
function mergeDeep() {
  var objects = Array.prototype.slice.call(arguments);
  return objects.reduce(function(prev, obj) {
    Object.keys(obj).forEach(function(key) {
      var pVal = prev[key];
      var oVal = obj[key];
      if (Array.isArray(pVal) && Array.isArray(oVal)) {
        prev[key] = pVal.concat(oVal);
      } else if (pVal === Object(pVal) && oVal === Object(oVal)) {
        prev[key] = mergeDeep(pVal, oVal);
      } else {
        prev[key] = oVal;
      }
    });
    return prev;
  }, {});
}

/**
 * Converts a date string to YYYYMMDDHH.
 * @param {String} val Date string
 * @returns {String} Converted date string
 */
function convertDate(val) {
  var date = new Date(val);
  return (
    date.getUTCFullYear() +
    ('0' + (date.getUTCMonth() + 1)).slice(-2) +
    ('0' + date.getUTCDate()).slice(-2) +
    ('0' + date.getUTCHours()).slice(-2)
  );
}

/**
 * Validates row values. Only numbers, booleans, dates and strings are allowed.
 * @param {Field} field The field declaration
 * @param {Mixed} val The value to validate
 * @returns {Mixed} Either a string or a number
 */
function validateValue(field, val) {
  if (field.getType() == 'YEAR_MONTH_DAY_HOUR') {
    val = convertDate(val);
  }
  switch (typeof val) {
    case 'string':
    case 'number':
    case 'boolean':
      return val;
    case 'object':
      return JSON.stringify(val);
  }
  return '';
}

/**
 * Returns the (nested) value for a requested column.
 * @param {Object} valuePaths Field name; if nested, field name and parent field name
 * @param {Object} row Current content row
 * @returns {Mixed} The field value for the column
 */
function getColumnValue(valuePaths, row) {
  for (var index in valuePaths) {
    var currentPath = valuePaths[index];
    if (row[currentPath] === null) {
      return '';
    }
    if (row[currentPath] !== undefined) {
      row = row[currentPath];
      continue;
    }
    var keys = Object.keys(row);
    for (var index_keys in keys) {
      var key = keys[index_keys].replace(/\s/g, '_').toLowerCase();
      if (key == currentPath) {
        row = row[keys[index_keys]];
        break;
      }
    }
  }
  return row;
}

/**
 * Returns rows containing only the requested columns.
 * @param {Object} content The content object
 * @param {Object} requestedFields Fields requested in the getData request.
 * @returns {Object} Rows containing only the requested columns.
 */
function getColumns(content, requestedFields) {
  if (!Array.isArray(content)) content = [content];
  return content.map(function(row) {
    var rowValues = [];
    requestedFields.asArray().forEach(function(field) {
      var valuePaths = field.getId().split('.');
      var fieldValue = row === null ? '' : getColumnValue(valuePaths, row);
      rowValues.push(validateValue(field, fieldValue));
    });
    return {values: rowValues};
  });
}

/**
 * Returns the tabular data for the given request.
 * @param {Object} request Data request parameters.
 * @returns {Object} Contains the schema and data for the given request.
 */
function getData(request) {
  var content = fetchData(request.configParams.url, request);
  var fields = getFields(request, content);
  var requestedFieldIds = request.fields.map(function(field) {
    return field.name;
  });
  var requestedFields = fields.forIds(requestedFieldIds);
  return {
    schema: requestedFields.build(),
    rows: getColumns(content, requestedFields)
  };
}
For more information, see https://github.com/googledatastudio/community-connectors/tree/master/JSON-connect/src
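Note that the connector only forwards the reportid and department configuration parameters as query-string parameters; the actual role-based filtering has to happen in the service that serves the JSON. The following is a minimal sketch of such an endpoint; the Express framework, the /report-data path, and the in-memory employees array are hypothetical stand-ins for your real HR backend.
server.js
// Hypothetical JSON endpoint the connector's URL would point to.
const express = require('express');
const app = express();

// Example data; in practice this would come from the HR database.
const employees = [
  {department: 'HR', name: 'Asha', salary: 50000, joined: '2020-01-15'},
  {department: 'IT', name: 'Ravi', salary: 60000, joined: '2019-11-02'}
];

app.get('/report-data', (req, res) => {
  const {reportid, department} = req.query;
  // reportid could be used to pick which dataset to serve.
  // Admin reports omit the department parameter and receive every row;
  // a department head's report sends department=<their department>.
  const rows = department
    ? employees.filter(e => e.department === department)
    : employees;
  // The connector expects a JSON array of objects.
  res.json(rows);
});

app.listen(3000);
The response should be an array of objects, because getFields() in Code.gs derives the report schema from the first element of that array.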
Step 2: Deploy your script. After deployment, you get a deployment ID and a direct URL for your data source.
Step 3: Provide your source JSON URL and allow the parameters to be overridden by URL. Allowing the reportid and department parameters to be overridden is what lets us request data department-wise. Then connect to the data source.
https://drive.google.com/file/d/1ys4RCPyWYC8wP31jWtCJcZpqt6SIKnMC/view?usp=sharing
So, by using a custom connector, we can show data according to the viewer's role, for example by giving each role its own embed URL as sketched below.
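Overridable data source parameters can be set through the params query parameter of a report or embed URL, a URL-encoded JSON object whose keys combine the data source alias with the parameter ID. The snippet below is a sketch under that assumption; REPORT_ID, PAGE_ID, the ds0 alias and the buildEmbedUrl helper are placeholders to replace with values from your own report.
// Build a role-specific embed URL by overriding the connector's
// overridable "department" parameter via the report's URL parameters.
function buildEmbedUrl(department) {
  // "ds0" is the data source alias inside the report; adjust it to match yours.
  var base = 'https://datastudio.google.com/embed/reporting/REPORT_ID/page/PAGE_ID';
  var params = {'ds0.department': department};
  return base + '?params=' + encodeURIComponent(JSON.stringify(params));
}

// e.g. put this URL in the iframe embedded for the Head of HR:
console.log(buildEmbedUrl('HR'));
The reportid parameter can be overridden in the same way if each role should also see a different report dataset.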
If you want to publish your data source script, it must meet the requirements at https://developers.google.com/datastudio/connector/pscc-requirements; otherwise, you can keep sharing it by its deployment link, which is the less secure option.
Limits of embedding
- Bookmark links can’t be used with embedded reports. If you need a customized view of the data, use editor filter properties and date ranges applied to the report.
- Some websites may block the ability to embed their content.
- Recursive embedding is not allowed: while you can embed a Data Studio report into itself, you won’t see the URL embed component in the embedded report.