Einstein Analytics – a complex SAQL case study for novices (Case 1)


By Nyjil Joshi and Radek Janhuba.

Thanks to Erika for patiently testing the results and helping shape them into the desired output.

Business use case: the business wants to know how subscribers registered in a given month churn over time. They also want to find the average duration across all subscribers registered in each month, and how those cohorts churn over the following months.

The diagram below depicts the manual report in Excel.

[Screenshot: the manual churn report in Excel]

As you can see, 167 subscribers registered in April, and you can follow how they churn over months 1, 2, 3, and so on.

The end result of this report is an average over the registrations of all months, showing how they churned in terms of average percentage.

Tool used: Salesforce.com – Einstein Discovery Plus

Solution:

  1. Load the CSV file containing this data into a dataset.
  2. Now that we have the data, build a compare table and use a pivot to see how the data is visualized and to get a baseline SAQL query. This is important because we are trying to align the data like the report above, so that the calculations become easy. (A sketch of such a baseline follows below.)
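As a rough illustration (the field names match this dataset, but the exact query a compare table generates will differ), the baseline SAQL typically looks like this:

q = load "datasetname";
q = group q by ('Registreringsdatum_Year', 'Registreringsdatum_Month');
q = foreach q generate 'Registreringsdatum_Year' as 'Registreringsdatum_Year', 'Registreringsdatum_Month' as 'Registreringsdatum_Month', count() as 'count';
q = order q by ('Registreringsdatum_Year' asc, 'Registreringsdatum_Month' asc);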

Initial compare table and pivot to get the base SAQL

  • Data Set

q = load "datasetname";

  • Filter

q = filter q by 'Aktivitetskod' == "xyz";

q = filter q by date('Registreringsdatum_Year', 'Registreringsdatum_Month', 'Registreringsdatum_Day') in [dateRange([2019,4,1], [2020,3,31])];

  • First query

q = foreach q generate
    'Registreringsdatum_Year' + "~~~" + 'Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month',
    coalesce('Passivdatum_Year', date_to_string(now(), "yyyy")) as 'Passivdatum_Year',
    coalesce('Passivdatum_Month', date_to_string(now(), "MM")) as 'Passivdatum_Month',

  • Derive the months to churn and make it continuous

number_to_string(case
    when 'Passivdatum' is null then date_diff("month", toDate('Registreringsdatum_sec_epoch'), toDate(date_to_epoch(now())))
    else date_diff("month", toDate('Registreringsdatum_sec_epoch'), toDate('Passivdatum_sec_epoch'))
end, "00") as 'MonthsToChurn';
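As a quick sanity check of this expression, consider a hypothetical subscriber (dates invented purely for illustration):

-- Registered 2019-04-15, went passive 2019-07-02:
--   date_diff("month", registration, passive) = 3  ->  number_to_string(3, "00") = "03"
-- Still active ('Passivdatum' is null): the difference is taken against now() instead.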

  • Re-arrange based on registration date, passive date and MonthsToChurn

result = group q by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'Passivdatum_Year', 'Passivdatum_Month', 'MonthsToChurn');

  • Calculate churned

result = foreach result generate
    q.'Registreringsdatum_Year~~~Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month',
    q.'Passivdatum_Year' as 'Passivdatum_Year',
    q.'Passivdatum_Month' as 'Passivdatum_Month',
    q.'MonthsToChurn' as 'MonthsToChurn',
    count(q) as 'Churned';

  • Use fill to ensure that months in which no churn happened are also included; later on, we will fill in values here as appropriate

result = fill result by (dateCols=('Passivdatum_Year', 'Passivdatum_Month', "Y-M"), partition='Registreringsdatum_Year~~~Registreringsdatum_Month');
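To make the effect of fill concrete, here is a small sketch (the rows are hypothetical, invented only for illustration):

-- Rows for cohort 2019~~~04 before fill (nobody churned in 2019-06 or 2019-07):
--   Passivdatum_Year  Passivdatum_Month  MonthsToChurn  Churned
--   2019              05                 01             3
--   2019              08                 04             2
-- After fill, the missing Y-M combinations appear with null measures:
--   2019              06                 null           null
--   2019              07                 null           null
-- The next foreach turns those nulls into a derived MonthsToChurn and Churned = 0.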

  • Get the right month format: where MonthsToChurn is null, derive it by converting the strings to numbers and taking the month difference plus 12 times the year difference; put a default value of 0 where no churn happened. For example, registration month 2019~~~04 with a passive date in 2019/07 gives 7 - 4 + 12 * (2019 - 2019) = 3.

result = foreach result generate
    'Registreringsdatum_Year~~~Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month',
    coalesce('MonthsToChurn',
        number_to_string(
            string_to_number('Passivdatum_Month') - string_to_number(substr('Registreringsdatum_Year~~~Registreringsdatum_Month', 8, 2))
            + 12 * (string_to_number('Passivdatum_Year') - string_to_number(substr('Registreringsdatum_Year~~~Registreringsdatum_Month', 1, 4))),
            "00")) as 'MonthsToChurn',
    coalesce('Churned', 0) as 'Churned';

  • Eliminate any negative months; we want only values greater than or equal to zero, i.e. passive dates not earlier than the registration month.

result = filter result by string_to_number('MonthsToChurn') >= 0;

result = group result by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn');

  • Calculate the remaining percentage of subscribers at every month, i.e. the total subscribers remaining each month out of those registered in a given month.

result = foreach result generate
    'Registreringsdatum_Year~~~Registreringsdatum_Month',
    'MonthsToChurn',
    first('Churned') as 'Churned',
    -- running total over months in descending order = subscribers still active entering each month
    sum(sum('Churned')) over ([..0] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) as 'TotalSubscribersmonthly',
    sum(sum('Churned')) over ([..0] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) - first('Churned') as 'RemainingSubscribersmonthly',
    -- remaining subscribers divided by the full cohort size (the [..11] window spans the 12 month rows)
    (sum(sum('Churned')) over ([..0] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) - first('Churned'))
        / sum(sum('Churned')) over ([..11] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) as 'MonthlyRemaining';
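The window functions do the heavy lifting here, so a worked mini-example may help (a hypothetical cohort of 10 subscribers registered in one month; all numbers invented). [..0] frames the window from the first row of the partition up to the current row, while [..11] spans the first twelve rows, i.e. the whole one-year cohort:

-- MonthsToChurn:  00  01  02      Churned:  2  3  5      (cohort size 10)
-- Ordered by 'MonthsToChurn' desc, the rows are processed as 02, 01, 00, so:
--   [..0]  running sum of Churned -> 5, 8, 10 = subscribers still active entering each month
--   [..11] sum over all rows      -> 10       = full cohort size
-- e.g. MonthlyRemaining at month 01 = (8 - 3) / 10 = 0.5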

  • Re-order

result = order result by ('Registreringsdatum_Year~~~Registreringsdatum_Month' asc, 'MonthsToChurn' asc);

  • Project the columns once more in a foreach so that it is easy to do calculations and aggregations

result = foreach result generate
    'Registreringsdatum_Year~~~Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month',
    'MonthsToChurn' as 'MonthsToChurn',
    'TotalSubscribersmonthly' as 'TotalSubscribersmonthly',
    'RemainingSubscribersmonthly' as 'RemainingSubscribersmonthly',
    coalesce('MonthlyRemaining', 1) as 'MonthlyRemaining';

  • Finally, the key statement: group by MonthsToChurn and average across all registration months

t = group result by 'MonthsToChurn';

t = foreach t generate
    'MonthsToChurn',
    sum('TotalSubscribersmonthly') as 'TotalMonthly',
    max(sum('TotalSubscribersmonthly')) over ([..] partition by all) as 'TotalTotal',
    sum('TotalSubscribersmonthly') / max(sum('TotalSubscribersmonthly')) over ([..] partition by all) as 'Average Duration';
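To see what 'Average Duration' means, a tiny hypothetical: two cohorts of 100 subscribers each, of which 80 and 60 respectively are still counted at month 03:

TotalMonthly(03) = 80 + 60   = 140
TotalTotal       = 100 + 100 = 200   (the month-00 row, where every cohort is still complete, is the maximum)
Average Duration = 140 / 200 = 0.70  ->  on average, 70% of a cohort is still around at month 3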


Note: this shows how the customers who joined in each month churned over the one-year period.

Disclaimer: this SAQL can be refined further; it can be cleaned up and even simplified into fewer steps. But I will leave that to your imagination. 😉
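For convenience, here is the whole pipeline stitched together, exactly as it was built up step by step above (still unrefined, per the disclaimer):

q = load "datasetname";
q = filter q by 'Aktivitetskod' == "xyz";
q = filter q by date('Registreringsdatum_Year', 'Registreringsdatum_Month', 'Registreringsdatum_Day') in [dateRange([2019,4,1], [2020,3,31])];
q = foreach q generate 'Registreringsdatum_Year' + "~~~" + 'Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month', coalesce('Passivdatum_Year', date_to_string(now(), "yyyy")) as 'Passivdatum_Year', coalesce('Passivdatum_Month', date_to_string(now(), "MM")) as 'Passivdatum_Month', number_to_string(case when 'Passivdatum' is null then date_diff("month", toDate('Registreringsdatum_sec_epoch'), toDate(date_to_epoch(now()))) else date_diff("month", toDate('Registreringsdatum_sec_epoch'), toDate('Passivdatum_sec_epoch')) end, "00") as 'MonthsToChurn';
result = group q by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'Passivdatum_Year', 'Passivdatum_Month', 'MonthsToChurn');
result = foreach result generate q.'Registreringsdatum_Year~~~Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month', q.'Passivdatum_Year' as 'Passivdatum_Year', q.'Passivdatum_Month' as 'Passivdatum_Month', q.'MonthsToChurn' as 'MonthsToChurn', count(q) as 'Churned';
result = fill result by (dateCols=('Passivdatum_Year', 'Passivdatum_Month', "Y-M"), partition='Registreringsdatum_Year~~~Registreringsdatum_Month');
result = foreach result generate 'Registreringsdatum_Year~~~Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month', coalesce('MonthsToChurn', number_to_string(string_to_number('Passivdatum_Month') - string_to_number(substr('Registreringsdatum_Year~~~Registreringsdatum_Month', 8, 2)) + 12 * (string_to_number('Passivdatum_Year') - string_to_number(substr('Registreringsdatum_Year~~~Registreringsdatum_Month', 1, 4))), "00")) as 'MonthsToChurn', coalesce('Churned', 0) as 'Churned';
result = filter result by string_to_number('MonthsToChurn') >= 0;
result = group result by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn');
result = foreach result generate 'Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn', first('Churned') as 'Churned', sum(sum('Churned')) over ([..0] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) as 'TotalSubscribersmonthly', sum(sum('Churned')) over ([..0] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) - first('Churned') as 'RemainingSubscribersmonthly', (sum(sum('Churned')) over ([..0] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) - first('Churned')) / sum(sum('Churned')) over ([..11] partition by 'Registreringsdatum_Year~~~Registreringsdatum_Month' order by ('Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' desc)) as 'MonthlyRemaining';
result = order result by ('Registreringsdatum_Year~~~Registreringsdatum_Month' asc, 'MonthsToChurn' asc);
result = foreach result generate 'Registreringsdatum_Year~~~Registreringsdatum_Month' as 'Registreringsdatum_Year~~~Registreringsdatum_Month', 'MonthsToChurn' as 'MonthsToChurn', 'TotalSubscribersmonthly' as 'TotalSubscribersmonthly', 'RemainingSubscribersmonthly' as 'RemainingSubscribersmonthly', coalesce('MonthlyRemaining', 1) as 'MonthlyRemaining';
t = group result by 'MonthsToChurn';
t = foreach t generate 'MonthsToChurn', sum('TotalSubscribersmonthly') as 'TotalMonthly', max(sum('TotalSubscribersmonthly')) over ([..] partition by all) as 'TotalTotal', sum('TotalSubscribersmonthly') / max(sum('TotalSubscribersmonthly')) over ([..] partition by all) as 'Average Duration';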

PS: All data used is sample data for building prototypes. Image credits: the image is taken from medium.com.

Covid-19 Battle – Glimpses from God's own country – Back2Life


Of late I have been posting on only one topic: Covid-19. For our life as we know it to come back to where it was before will take an indomitable will from mankind; it is a Himalayan task we need to scale, looming ahead of us like an ominous, imposing specter.

So please bear with me as I keep posting this news from the part of the world I am a native of. I am again bringing into the spotlight a small state in India (one of the most densely populated) that has shown India and the world how Covid-19 can be taken by the horns and stopped dead in its tracks. The important news that needs to be shared is anything positive happening on the battlefront; if we are winning, share it so that it can be emulated elsewhere in the world.

The Washington Post is now reporting what I was talking about a few days ago: how Kerala is becoming the right example of how the battle against this malevolent being must be fought.

It is fought with strategy, discipline, firm force where required, and collaboration among communities supporting each other.

This battle was not smooth in a state where the population density is a staggering 859 people per square kilometer. Here are some snapshots from the frontline of the battlefield:

  1. Cops getting kicked by bike riders whom they tried to dissuade from roaming around (a husband-and-wife duo kicked a police officer). Police were even shot at and mobbed.
  2. Nurses getting spat upon and slapped in the isolation wards when too much isolation starts playing with people's minds; patients walking naked in front of nurses in defiance, and so on. It's a long list. (This happened elsewhere in India.)
  3. In apartment complexes, when a family is identified as Covid-positive they are put in isolation, and the neighbours cook food for them and place it in front of their door.
  4. To ease the tension and win people over, cops even broke into impromptu dance sequences on the road to convey the message. This was surprising for a community that had only seen their tough side; it made us really love them, and we started listening to them.
  5. https://www.thehindu.com/news/cities/Kochi/music-video-by-kochi-police-widely-shared/article31315353.ece?utm_source=taboola
  6. Every religion has some extreme-thinking people, and Kerala too had a few, who brought in the virus after attending a religious congregation; they too had to be identified and contained. It was a battle within the battle.
  7. Closing of state borders, allowing only essential supplies to pass through. Rocks and mud were used to build barricades in some of the bylanes of interstate roads to prevent unmonitored traffic.
  8. Drones being used to monitor public spaces.
  9. The state supplying food (my parents got a quantity of rice; it was a refreshing experience). Community kitchens were established in all areas, supplying food parcels, especially for workers from other states who can't travel and have also lost their jobs.
  10. Another thing I am proud to state: the medical system in the state is efficient at handling large-scale crises. Recently the entire state was flooded, with the waters in some places submerging houses completely, and we bounced back. I have been to many parts of the world, but the doctors here are blessed with a healing touch; add traditional Ayurveda to the mixture, and it becomes truly God's own country (as we call it). (We do have our own issues: prejudices, a bit of laziness within our state though we work hard once we come outside of it, arrogance, drinking issues, false pride, defiance, etc. In a humane society these emotions will be there; the important thing is how we keep them in check.)

The Washington Post does call out the political side too, but that is of no interest to me here; I am focusing on the positive measures that are giving results.

Let me also say kudos to the leadership team in Kerala for the inspiring control they have over the whole situation.

So during these times any success story has to be shared as it can become a beacon for others still in the dark.

I am also happy to see the role India is playing during these times; we are blessed to be led by a legendary leader like our dear PM.

Namaste!

PS: The article that made me pen the above words:

https://www.washingtonpost.com/world/aggressive-testing-contact-tracing-cooked-meals-how-the-indian-state-of-kerala-flattened-its-coronavirus-curve/2020/04/10/3352e470-783e-11ea-a311-adb1344719a9_story.html

 

Glimpses of Kerala

https://twitter.com/nikotjr/status/1239133797750206464

https://www.youtube.com/watch?v=R83BlU5nnbs

https://www.youtube.com/watch?v=JOoIF2XPMqs

https://www.theweek.in/videos/video-featured.playlist.video.6116715999001.html

Note: Thanks to Nikolay Timoschuk Jr; I took some concepts from his videos. Love you, Brother.

Einstein Discovery (AI) – Customer churn analysis for a lottery company.


Use case: 'Gaming Paradise' (GP) is a non-profit organization with a focus on society and its well-being. To fund its activities it relies on online gaming and subscription lotteries for revenue. It is focused on retaining a minimum count of profitable customers, so that it has a reliable and predictable source of income. The management is keen to understand the factors behind why a subscriber drops out of an active subscription, and would like deep insights into the patterns that lead to churn for a given set of customers. They have millions of records of subscription activity and would like to feed this data to a statistical model to get insights and predictions on churn. Enter 'The Einstein'.

Current state: It is currently impossible to go through the whole set of records and form an analysis of churn. Looking at some sample data and then using intuition, a decision or hypothesis is made as to why the customer left: it may be age-related issues, or customers reaching pension age. Deeper insights for every age category of customers are difficult to gauge.

Future state: GP wanted an AI tool that can read through millions of records and come up with a statistical analysis, based on models, that tells them how the various independent variables contribute to or influence churn. They want "what happened?" and "why did it happen?" analysis. The management also wanted to predict how churn would change when certain combinations of variables or data values were selected and observed.

The tool and its modeling approach:

The AI tool we selected for this case study is the industry-leading Salesforce Einstein Discovery.

  1. Data analysis: The current system holding the data was a legacy system built on SQL Server. The data model of how the subscriber transactions were linked to campaigns, media, the customer (subscriber), payment types, etc. was analyzed to understand the nature of the data, the data model and the data quality. The data was extracted from the legacy system using Apex Bulk API code with indexes set, then loaded into standard objects, custom objects and big objects. We had to rely on big objects due to the huge volume of transactions; big objects have their own data store and licensing model from Salesforce.com. We had more than 3M records in subscriber transactions, and it took almost two weeks to complete the load. This data now resides in the Salesforce Account object, a custom object, and a few big objects. The subscriber record, the lowest-grain record, holds the details of when churn happened, how many lottery tickets were purchased and delivered for a given subscription, the customer the record was linked to, the media used to connect to the customer, the campaign through which the customer came in, etc.
  2. Data modeling: After studying the legacy data, we need to decide which fields to include in the churn model: whether, for a given subscription record, we should bring in media, campaign, customer age group, location, ticket counts, deliveries made, payment methods, and so on. We also need to identify the one key outcome variable from which we can derive a boolean 'true' or 'false' result. In this case we wanted to reduce churn, so a status field containing the value 'Active' or 'Passive' (customer) was used to derive a true or false value. When we run the story in Einstein Discovery, we use this field, set minimize churn = 'Yes' as the outcome, and then select all the variables that may have an impact on this outcome. Later in the modeling we also eliminate variables that have a very strong correlation, since these can distort Einstein's analysis. For example, if we are trying to minimize the variable churn = 'Yes', make sure the model does not also include a variable that means the same thing, such as status = Active/Passive; otherwise, what we are trying to predict is also being provided as an input to the model, which can affect the analysis in the wrong way. Some fields will also have to be dropped based on the recommendations given when the story runs.
  3. Data extraction and transformation: One of the primary activities needed to supply data to the Einstein Analytics engine is to create a dataset and fill it with the required data in denormalized form. For this we rely on the dataflow utility in Einstein. Using this feature we can extract, transform and even derive new data to feed the statistical model. We rely on techniques like computeExpression, computeRelative, sfdcDigest, sfdcRegister, augment and many other tools to shape the data before we register the dataset. We can come up with several models: for example, a model based on a particular product, or a model based on all of a customer's subscriptions or only the latest one. The whole idea is to architect the models that give the maximum insight.
  4. Data visualization: Once the data is in the dataset, we can select the desired columns using a lens for data exploration. We can do a preliminary assessment of the dataset using bar and trellis features on the lens. We can filter the data, add columns, use formula fields, and write SAQL to create more complex visualizations, like a compare table to find cumulative totals, or cogrouping to join two datasets on key fields. One interesting requirement was to join two datasets: one held target values for each month (total sales per month for an entire year) and the other held the actual sales in a given month to date, each coming from a different source. They were joined using a cogroup, and the cumulative totals were derived using compare table functions. The data metrics of the two datasets can differ, and we correlate them using features like combo charts and a single axis; here, to join them, I had to rely on dates. For the final data visualization, we created analytics dashboards showing various sales performance figures and forecasts using a collection of widgets.
  5. Running the story: Once you have analyzed and scrutinized the dataset using a lens, you are ready to create the story. Select your goal variable, in this case minimize churn = 'yes'. You can manually select your model variables; pick only those that have an impact on the outcome variable. For example, the age group of people who drop out can have an impact on the outcome, so we should factor it in; the amount spent might not have a similar impact, so you can possibly exclude it from the model. A few pointers to factor in: a) Segmentation: segment by a particular type of lottery product rather than mixing all products together. Say you have 3M records across 4 products, of which 3 have more than a million records and the last one only about 1,000 records; run the story separately for each of the 3 products in the 1M+ category. b) Columns with few predominant values: if one value shows up in 90% of your dataset records, it is better to select only that product and do a churn analysis on it than to mix in products present in smaller numbers. c) Drivers: if the drivers for a chosen segment are different, create different models for it and run the story. d) Multicollinearity: this happens when there is a high correlation between predictor variables (the independent variables used in the model), creating redundant information that can skew the results. e) Outliers: they degrade the results. These are values beyond certain limits with little occurrence; try to find them with some trial runs and eliminate them from the model. f) Dates: give special attention to dates. Einstein handles a date as a number: a birthdate of 01/01/1974 is translated into seconds from a particular point in time and shown as a long number, tagged with the word epoch, and we have to use this form in date calculations.
  6. Fine-tuning the model (iteration):

We will have to repeat step 5 several times to get a good model. A good model can be detected using the 'Model Evaluation' section of the story results, where you will see four sections: recommended updates, insights, models, and settings.

[Screenshot: ROC curve in the Model Evaluation section]

When you go to the Models section you can see the ROC curve; observe the area under the curve. The more area under the curve, the better the model. What you see above is a good curve: it is more realistic than a curve hugging the top line of the graph with all the data on one side, and it does not fall on the diagonal, which would mean your model is about as reliable as tossing a coin. This graph shows the majority of the data on one side, which is good, with a few points falling on the less accurate side; that is OK, as it is a common observation in a large dataset. You can see that the accuracy is almost 0.92 and the true positive rate is 0.93.

The confusion matrix also shows the degree of true positives and negatives detected by the model.
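For readers new to these metrics, here is how the two figures mentioned above come out of the confusion matrix (standard definitions; TP, TN, FP, FN are the counts of true/false positives/negatives):

accuracy           = (TP + TN) / (TP + TN + FP + FN)
true positive rate = TP / (TP + FN)

Note that on an imbalanced dataset a model that always predicts the majority class can still score a high accuracy, which is why the ROC curve and the true positive rate are inspected alongside it.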

7. Gaining insights and predicting outcomes:

Once you are convinced by the model, it is time to look at what the Einstein AI has to tell you, in the form of three main information types:

  1. What happened: Here it shows the churn of subscribers based on a strong predictor variable, say 'subscription date', over the time frame you mentioned in the model. (I am giving a very simple example here, as these sections contain tons of data for each independent variable and the single outcome variable.) Basically, the analysis shows how all the independent variables influence the outcome variable we defined, in this case minimize churn = 'yes'.
  2. Why it happened: In this section of the analysis, the effect of each key independent variable on the outcome variable is explained in detail, showing how the variable influenced churn in a positive way (reduction) or a negative way (increase). It calls out patterns and observations that might have led to an increase or reduction in churn. The strongest influencing variable is always displayed first, and the remaining variables follow in descending order of influence, each with its impact on the outcome variable.
  3. Predictions & improvements: This part of the story analysis has two sections: a) predictions and improvements, and b) the difference between two variables and its impact on a certain course of action. a) Predictions and improvements: to put it in real-life terms, we can select a product and look at its performance when we select a channel, say 'Web' or 'Telemarketing'; it shows whether churn becomes better or worse. b) What's the difference: here you can select two media, say two different channels, and see whether the difference in churn is positive or negative.

Notes on statistical models used by Einstein: When we use correlation to assess the impact of various independent variables on a dependent variable, many models come to mind, such as linear regression, residual sum of squares, and quadratic or polynomial fits. Simple regression cannot be used here, as business scenarios and data are very complex. So, to avoid issues of overfitting and underfitting, Einstein uses piecewise linear ridge regression with interactions. Overfitting results in memorizing rather than predicting, and leads to low training error but high prediction error. Ridge regression prevents overfitting, and k-fold validation allows us to tune the ridge regression model to make the best predictions.
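For the curious, the standard textbook form of the ridge regression objective (not necessarily Einstein's exact internal formulation, which adds piecewise terms and interactions) is:

minimize over b:   ||y - Xb||^2 + lambda * ||b||^2

Here X holds the predictor variables, y the outcome, and lambda >= 0 controls how strongly the coefficients b are shrunk toward zero; a larger lambda means less overfitting at the cost of more bias. k-fold validation picks a good lambda by training on k-1 folds and measuring prediction error on the held-out fold, rotating through all k folds.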

Key challenges: One of the key challenges faced was the sheer amount of information Einstein gives out; we have to read through these insights. One way to evangelize this learning among the key stakeholders is to conduct a workshop on how the insights can be read and understood. A person from the business side should be involved in these workshops: a data scientist, or anybody with an understanding of the business and its data. The analysis is not something you run once and leave; it is an ongoing assessment, and the tool will be used periodically to check performance after we incorporate key actions based on the initial insights from Einstein. We will revise the models, run independent analyses of individual products, and so on. The story will continue…

Credits: Thanks to Radek Janhuba (Fluido, Finland) for being a guide to me on this exciting journey and for showing me the way out of tight spots 🙂

Marketing Cloud Einstein – Recommendations to success

I recently had a tryst with Marketing Cloud Einstein, its predictive intelligence in particular. If you want to sense what your target audience is doing on the website you just sent them to via a campaign, the solution is to incorporate predictive intelligence. Now, how is this done? Seeing what customers did, or are doing, is now at the tip of your fingers: you need a few nifty technologies, and then you are good to go in sensing your customers' activities on your website.

A business case

An online consumer products company selling garments wants to set up a catalog of its products and make use of Einstein recommendations based on customer actions on its website.

The key thing here is that we need to set up a catalog of products: a CSV file with product details, image links, keywords, product descriptions, etc. We will rely on Personalization Builder and Email Recommendations to accomplish this, enabling 'User Tracking' and 'Custom User Profile Attributes'. One attribute you can add in this scenario is 'explicit_preference'; it is used to track, for a given set of products, the keywords used to collect data from user activity on a given product page. The COLLECT CODE pushes these details to your Einstein instance on AWS using a REST API call, along with your BU ID (MID) and the product code. This way we can start tracking the AFFINITY of the user for the products he visits on the page, and Einstein starts using this data for its model training. The session details are pushed to the Marketing Cloud data extension ISO DataExtension, and the recommendations are put in a different data extension. The affinity scoring is used by Einstein to recommend products to the user: for example, if he selected and viewed mostly black jackets from the Biking category, more similar black-colored items from the Biking category will be recommended, say gloves, helmets, etc.

Catalogs can be updated manually using the Enhanced FTP approach, or even via streaming APIs. Collect codes can be attached to customer personas. There are also some areas where we need support from the Salesforce professional services team, for example triggers: if there is an abandoned cart and we need to trigger an action based on it, we might have to reach out to Salesforce. SQL-type queries can be written, but for complex ones we will again have to connect with the professional services team.

A key point to understand here is that, unlike Service Cloud or Sales Cloud Einstein, Marketing Cloud Einstein is housed on AWS. Yes, the statistical modeling and analysis all come from the AWS side. Machine learning APIs can be used there to work on a model, and frequent data feeds from websites via collect code can be used to train the model to provide more accurate recommendations based on a catalog.

A few interesting points to note;

  • We can override Einstein recommendations; the rule manager can be used to influence them.
  • We can set up predictive email logic, which involves setting up Display and Logic rules using Personalization Builder. We can have logic that enables Einstein to show product details dynamically, based on the latest updates: when you open an email with a product recommendation and later revisit that email, anything newly added will also appear in your email view, so it is pretty up to date. This is accomplished using a code snippet generated from the Display and Logic setup and embedded into your email template. Cool stuff!
  • There are data extensions that capture activities for Personalization Builder, Predictive Intelligence and Einstein reporting, such as 'Einstein_MC_Predictive_Scores'.
  • In Journey Builder, the tool used for orchestrating customer journeys, a few new features have come in, one of them being Einstein Splits. Use the Einstein Split activity in Journey Builder to segment customers into logical customer journeys based on Einstein Engagement Scoring data. There are five Einstein Split options:
    • Persona Split – Engage customers based on their Einstein Engagement Scoring Personas.
    • Conversion Split – Engage customers based on their likelihood to make a purchase, download content, or complete a form on your website.
    • Email Click Split – Engage customers based on their likelihood to click a link.
    • Retention Split – Engage customers based on their likelihood to continue as a subscriber.
    • Email Open Split – Engage customers based on their likelihood to open an email.

One thing to understand here is that the scoring personas mentioned above are based on the affinity scoring done by Einstein from the explicit_preference we talked about above.

The Einstein personas can be loyalists, window shoppers, selective subscribers and 'win back' subscribers. I think this is a very simple classification, and it makes sense.

Now, based on all I said above, to simulate this you will need:

  1. A sample website that shows a catalog of products which customers can select, view and order. Salesforce has a nifty NTO (Northern Trail Outfitters) website for training.
  2. An Einstein setup on AWS
  3. A Marketing Cloud org with catalog and Einstein options
  4. A sample HTML page that can be used to simulate customer actions with collect code (given below)

Collect code plugged into an HTML page

<html>
<head>
<!-- Marketing Cloud's collect.js library (the script include provided with your collect code) must be loaded before the calls below -->
</head>
<body>
<script type="text/javascript">
  var _etmc = _etmc || [];
  _etmc.push(["setOrgId", "1000100"]);                  // MID
  _etmc.push(["setUserInfo", {
    "email": "customeremail@yahoo.com",
    "details": {
      "explicit_preference": "Camping"                  // Remember this guy!! Einstein needs this for AFFINITY scoring.
    }
  }]);
  _etmc.push(["trackPageView", { "item": "1225100" }]); // Product Code
</script>
ALL Love Einstein!
</body>
</html>
A sample catalog record
ProductCode,ProductName,ProductType,ProductSubcategory,RegularPrice,SalePrice,OnlineAvailability,Keywords,ProductLink,ImageLink,ImageLinkThumb,ProductDescription
1000100,MEN'S APEX BIONIC JACKET,Mens,jackets,149,119.2,Y,hiking~camping~cold weather~insulated,http://ntoretail.com/shop/mens/jackets/1000100,https://s3-us-west,,,Camping

Salesforce partner summit – importance of participation

Dear Readers,

I had the opportunity to attend one of the recent summits held in Bangalore. It was an enlightening one. Salesforce's commitment to growing the ecosystem is truly visionary. They have started a whole movement of infusing knowledge right from the college level in India, and have even taken a step forward in identifying the superstars on Trailhead and connecting them with organizations. Truly commendable.

Without much ado, let me pen down a few notes on what you can expect at a partner summit: an insight into the kind of visions, data points and strategies that are shared.

Disclaimer: These are my own words, written down based on my own understanding. Any mistakes here are purely mine, and I take full responsibility. You are advised to contact Salesforce directly for any guidance; for any clarification on figures, the Salesforce contacts at your organization are the right point of engagement.

My notes …

Introduction session notes

  1. Salesforce 2022 growth predicted at 13.8% CAGR
  2. Salesforce Services growth predicted at 33% CAGR
  3. Current revenue at $10B+
  4. New jobs prediction at 3.3 million; 250K consultants (5 million including indirect jobs)
  5. 2 million in India predicted (please re-verify)
  6. 400+ consulting partners
  7. India: 50%+ of consultants, largest employee base outside the US, largest developer base outside the US
  8. 7 key industry verticals (with market share):
  • Financial (8%)
  • Healthcare (11%)
  • Retail & consumer (10%)
  • Manufacturing / Automotive (10%)
  • Communication & Media (9%)
  • Travel & Hospitality (7%)
  • Govt (4%)

Strategy -> Build blueprints / Create a roadmap / Launch sales plays / Bring solutions / Enable the field

Key industries -> Finance / Health & Govt, expecting 300% growth, lowest attrition at 2%. Healthcare rated 1st by Forrester and KLAS.

New segments

  • Healthcare for payers
  • Financial services for Comm Banking…

New Industries

  • Manufacturing / Consumer Goods
  • Govt (Ongoing)

Einstein Intelligence & Apps

  • [Your Data] -> Answers -> Feedback -> Machine Learning -> [Your Data]

Built-in surveys OOTB.

Partners as design partners (with customer cases) to jointly develop products.

B2C – Customer story with marketing cloud and new focus.

  • The 4th industrial revolution is AI
  • The 4th marketing revolution is intelligent marketing
  • 52% of clients will switch if we don't market innovatively.
  • Using MC, assess revenue via SMS, email, social
  • Communication based on behavior
  • 2 million SMS/emails per month in India is possible; used by a client. (Please re-verify)
  • Various studios – Advertising Studio (Google), Interaction Studio (customer touch points)
  • Website data flows into analytics
  • 130+ customers in India
  • Gave a solution walkthrough with a case study using DMP, Communities, and a website – prospect-to-customer conversion across channels. There is a good slide showing the journey.
  • Journey Builder analytics powered by Google Analytics 360. Create leads, cases, add audiences on the run; offers; interactive.

Integration Cloud – Mulesoft

  • Move away from a project-based approach to a platform application network approach.
  • Point-to-point is not always advisable; reduce code and technical debt, centralize management
  • 150 connectors and APIs
  • 67% faster projects
  • Academic alliance – training and certification; the first 20 architect trainings free.
  • Create capacity – capability – customer readiness
  • Co-sell / co-deliver / build capacity
  • Big opportunity – 170,000 SFDC customers, with 130 MuleSoft customers.

10 Takeaways for partners

  • Lightning: all new orgs on Lightning; by year end, 100% parity with Classic
  • Lightning Flows – with integration calls
  • CPQ & Billing using SteelBrick; triple-digit growth (200M revenue)
  • B2B commerce (suppliers & vendors) using CloudCraze, built natively on SFDC
  • Field Service Lightning (100M revenue): B2C, intelligent resource scheduling, offline
  • IoT enterprise: can process 1B events a day, 100M devices, low code
  • Distributed Marketing (connect with financial advisors / insurance / auto dealers)
  • Smarter selling with Sales Cloud Einstein (AI: trends, insights, predictions)
  • Service Cloud, Einstein Bots – case resolution, deflection, reduced handle time, faster solving
  • Einstein – Prediction Builder (on any object) / Discovery (on any data)

Partner Marketing Center (PMC)

  • A new support portal for partners, with lots of collateral
  • Email marketing support
  • Web content syndication
  • Social media syndication
  • Videos for marketing with other collaterals
  • http://p.force.com/pmc

Dreamforce 2019 (quick notes)

Trailhead chat

  • Trailhead for students: 3,000 students trained, 100+ super badges, Rangers
  • Hiring – laterals, with partner trainings on Apex/VF
  • Hiring – fresh out of college
  • Hiring – universities/colleges
  • Hiring – by hunting based on social profiles
  • A million jobs in 4 years
  • Enabling youth
  • Started in 2017; 20,000 students on the program
  • Modules in Trailhead, superbadges, certifications, instructor-led trainings
  • Training partners like ICT Chennai.
  • Training program covering 200 hrs; partners tie up with preferred colleges
  • Partners can mentor
  • Starting the GURUKUL program; 3,000 slated to graduate.
  • CVs/profiles can be shared by the SFDC team if interested.
  • CTA program mentoring: those who have completed System Architect and Application Architect, with only the presentation pending, can join; workshops by the SFDC team.
  • Ohana meetups; hiring possible; partners can play a role

Partner Scale programs

  • Partner community: a must for keeping everyone updated; link certifications and Trailhead accounts. Firms will be rewarded if all can complete this by month end.
  • Webinars
  • Support groups available, even industry-wise.
  • Advanced training for partners
  • NEW: Partner Adventures, a single portal for a particular knowledge area like Marketing Cloud, Service Cloud, etc.

Building demos with customers

  • "Demo Engineering" teams at SFDC can mentor (new terminology alert!)
  • They have created 2,500 demos and demo components; a 130+ strong dedicated team
  • 8-hour turnaround time (TAT)

Best practices in practice development (PPD)

  • PPD + partner community + Trailhead
  • Extend the sphere of influence via architects
  • Grow Trailblazers
  • Industry-specific learning paths
  • ANZ – 80 people re-skilled in 8 weeks
  • DX pilot programs
  • 51% of PD1-certified candidates are in India
  • Webinar uptake very low!!

India / APAC plans

  • Main focus now shifting to APAC
  • India is the second biggest player for SFDC
  • Billion+ revenue targeted
  • 50B devices hitting the market; IoT and data science gathering momentum
  • A 7T economy for India by 2026, the 4th biggest.
  • In the next 5 years, from 700 to 3,000 new developers in the India SFDC office
  • Choose industries and specialize; Media/Hospitality are new areas, Healthcare is big.
  • Pricing in India – always value-driven…

PS: As you can see, attending the summit can be fruitful for learning the possibilities.

Note: For any typos or wrong information, I am ready to rectify; please contact me at nyjiljoshi@yahoo.com / 9980250014.

Recent news: https://www.thehindubusinessline.com/news/salesforcecom-to-train-1-lakh-students/article24773959.ece

Salesforce Marketing cloud & Data Science – positive cohesion


My tryst with MC began a year back, when I was asked to explore the platform and build a working model that would become the basis for laying the foundation of a new knowledge area in our Salesforce practice. One of the main challenges in learning this tool is licensed org availability. It is a premium tool for organizations that really do need help with marketing heavy lifting: sending ads and communications to customers via email, phone and social media. The availability of the various studios that help us design content for email, SMS and social media advertising makes it a very powerful tool and platform for marketing. The key challenge is not in learning the tool but in designing your content and campaign processes to leverage the tool in the most efficient manner possible.

You need to decide on your data load strategy into MC: either place files in the Enhanced FTP cloud folder, or use the APIs exposed by MC that can be called from other systems. We can even use middleware, but we would need to explore the availability of suitable connectors. MC integrates seamlessly with Service Cloud and Sales Cloud using the MC connector. One key aspect to be aware of: we can bring data from objects in those modules into MC, but from MC we can only bring tracking data back to Service and Sales Cloud.

The data is stored in lists or data extensions. All customer data is added to a master list called "All Subscribers". The tool has the capability to validate your content using various built-in tools like Validation Inspector and the CAN-SPAM compliance check, and supports setting up sender and subscriber profiles. The Preference Center can be used to set customer preferences. Complex data modelling can be implemented using the Contact Builder tool: apart from basic contact or subscriber data, we can also store other related information about the contact that can be used in a marketing campaign.

We can use automation events to detect the placement of a file in the cloud folder, and this can kick off a complex Journey Builder process that sends emails out, captures customer responses, and updates the data extensions from which the Journey Builder picked up individual subscribers for a send operation.

Normally a Journey Builder run takes a few hours to a few days to complete. Once an email is sent, the understanding is that it takes 1-3 days before we actually get a response from the subscriber. These responses are captured and written back to the data extension in the same Journey Builder run context. AMPscript can be used to personalize the content of an email send.

We can use CloudPages to capture information from customers via a link exposed in the email we have sent. When the customer opens the email, the link opens a web page built using the CloudPages feature. All customer responses can be captured and written back to the data extension using AMPscript.

Mobile Studio can be used for sending push notifications. For this we define the messages, and an app needs to be downloaded to receive these push notifications on the customer's mobile phone. Using Social Studio we can listen to customers by creating customer personas and proactively address live discussions in the social media sphere of influence.

We can embed predictive intelligence scripts on web pages to track website visitor activity, which gives us insight into the likelihood of a particular visitor being converted to a customer.

Marketing Cloud can ingest data from data science work and run targeted campaigns on the segmented data. Some of the data science activities that can work in tandem with the MC application are given below.

  • Segment data externally using data science – We can use R or SAS statistical analysis tools to segment customer data on various parameters and come up with segments based on defined models. These segments can then be used as input to MC and processed in a campaign. The segmented data is loaded into MC as files and executed as an outbound email campaign: the data placed in the folder is detected by automation events, which in turn call a Journey Builder activity. The process reads the loaded data from the data extension and sends email communication to subscribers. The subscribers' responses are captured via an engagement split activity and written back to the data extension. Customer feedback can be captured via CloudPages, and those responses are also written back to the data extension.
  • Machine learning – Extract repeated campaign responses over a time period and feed them to a statistical model for prediction using machine learning. Identify your most promising customers to target. Big data can be a big source of data for this statistical modelling. Google machine learning APIs can be used to get predictions from trained models that use deep learning, giving us predictions on customer behaviour, which in turn can be used to market effectively. Einstein is scaling up to take on this role in the near future, if you look at the way it is maturing. Hopefully.
  • Einstein Intent / Sentiment APIs – for converting an interaction on the website into a service request. This involves creating a file that labels your customer queries, creating a dataset, then creating a model and training it to give accurate predictions on what kind of inquiry the customer is making, and finally creating a service request and routing it to the right team. Sentiment in the interaction between customer and agent on the website can be assessed using natural language processing. I see Einstein itself maturing to such an extent that it may one day replace the need for R or SAS and achieve most data analysis activities within Salesforce itself; there may be a specialized area developing here. Marketing Cloud itself can make use of intent and sentiment capture while doing surveys via email: we can use CloudPages to capture feedback from customers and, in the process, assess intents and sentiments, driving further marketing strategy down the campaign trail. AMPscript can be used to make a REST API callout to the Einstein engine. (Disclaimer: this part I have yet to implement, but it is possible based on my explorations with Intent API callouts and CloudPages with AMPscript.)

Next >> Marketing Cloud and proximity marketing using low energy beacons

 

Financial trading platform for banks and NBFCs on Salesforce1


The CloudUnbound team recently completed development of a trading platform for the financial domain, dealing in loans. The idea initially came from a discussion on the mounting NPA (non-performing asset) issue the whole country was facing, and thinking about how to create a platform to assist banks in disposing of their NPAs. We were not financial domain experts, so when we got a link to a bank marketing resource we slightly changed path to loan trading. It was a detour.

Our trading platform for financial products has two key players: banks and NBFCs (non-banking financial companies) on one side, and customers seeking loans on the other. We facilitate their interaction and trading by giving them a platform built on Salesforce1.

Initial solution – Phase 1

We wanted to bring in something very different from the existing competitors, so we took the approach of a platform where customers place a loan request (LR), which is then displayed to the bank agents directly on-boarded on the platform. The LR is made visible to bank agents based on location, and we provided enhanced search criteria so banks can narrow down the list to work on. The bank agents then, based on the customer's profile and key data, like the company he works for, salary, existing loans, designation and the many other parameters that constitute the KYC, decide whether to bid for that LR by offering the best possible base rate.

Based on our discussions with the banks, we got mixed reactions to this approach. Public sector banks were a bit wary; they said they don't stand a chance, as the private sector, with its many tricky ways to undercut rates, would easily beat them to a pulp on this platform. Basically, the public sector loses out to the private sector in a model where trading for loans is based purely on rate reduction: bidding rates down as far as the banks directly or indirectly allow.

Private sector banks, on the other hand, were jubilant; they loved the platform and the idea. But as creators of the product we wanted justice for all on the trading platform: one that was fair to all banking sectors, including the NBFCs.

Customer View


Bank View


So we went back to our drawing board.

Solution Phase 2

Our new solution was based on the input that a bank or NBFC needs to showcase more attractive features of its products than just rates. For example, Bank XYZ will have a house loan with a given rate (which of course changes slightly based on KYC) plus other features, like a tie-up with a particular builder, or extra goodies offered if the loan is taken from them. So now we gave the power to the customer: banks configure their products on our platform with all their features, and customers can select banks and products, compare them, and place an LR for the ones they like. The LR is then routed to the respective bank's agents based on location.

The bank agent then looks at the KYC documents uploaded by the customer and takes a call on accepting or rejecting the LR. If the bank accepts the LR and takes it to a successful closure, LoanUnbound is paid a commission. Throughout the process, customers are kept notified via SMS and email. Once the LR is accepted by a bank, the interaction happens directly between the bank and the customer. The final status of a successful loan grant or a rejection is updated by the bank.

A survey is also sent to the customer to capture feedback on the whole experience.

Some of the key benefits to the customer:

  1. A free platform where he can place an LR instead of physically visiting bank branches, which takes away a huge chunk of his productive time.
  2. A device-agnostic platform, available on smart devices
  3. Communication from the system on the status of the loans
  4. Various types of loans, like home, car, educational, personal, gold, etc.
  5. LRs are handled directly by bank agents and not by middlemen.

Key benefits to the Banks:

  1. A loan marketplace platform that they don't have to maintain; they pay only a nominal maintenance fee per user.
  2. Pay only for leads that are accepted or successful.
  3. Rejected leads go to the banks that have a higher appetite for high-risk leads.
  4. Agents can use it anytime, anywhere, as it is device agnostic
  5. The solution approach can be redesigned with minimal effort for many verticals like insurance, metal trading, micro-finance, crowdfunding, etc.

Salesforce1 solution design considerations

For startups, I feel Salesforce can be a costly solution; however, with some combination of licensing and custom user (customer) handling, we can achieve results that fit our pockets. These initial solutions can evolve to embrace more features offered by Salesforce.com as we make money.

Some of the key solutions we incorporated are given below:

  1. To start off, we used a single user license; Salesforce, of course, was not happy that we were launching a platform for a mega market like India on a single license.
  2. I had to get into a lengthy one-hour call with Salesforce architects to explain how it could be made possible with a combination of custom users (banks and customers), custom session handling, and purchasing extra storage if the platform really fires up. Storage was something they finally agreed on.
  3. We used the Bootstrap framework for the look and feel, along with flexibility for changes.
  4. It was device agnostic; mobility was incorporated into the design.
  5. All objects are custom.

We took this approach to get the prototype off the ground with minimal expense. The path ahead is to move it to customer communities, where I can go with session-based logins per month or user-based licenses. In the long run this will be a more scalable model.

Market potential

Loan Market in India – 72,00,000 CR INR

NPA Market in India   – 8,00,000 CR INR

[Chart: Loan Growth]

The solution is still at the prototype stage, and we did have a couple of meetings about funding, with a VC and with a popular bank. The solution was shown to the banks, and they were excited to see it, but they did suggest changes for us to incorporate, like product features instead of only rates. Of the two parties we approached, the VC compared it to BankBazaar and so was not keen. We then took it to the bank that was looking to fund startups in financial areas and gave a demo of the product and the possibility of it being used for loans and even NPAs; the manager compared us to Alibaba. This was very disappointing, as the manager didn't know what Alibaba was, and so he had to take a call on our loan trading platform with a wrong understanding of the solutions out there in the market. The meeting ran well over an hour, and there were others waiting, so we had to wrap up.


Managing the managed packages


It's a boon to have AppExchange, which churns out solution after solution for all our known business needs. We have apps that are free, like ours (CVT Codebackup), and apps that charge a premium. Apps come in two categories: managed and unmanaged.

Some of the applications I have worked with are Remedyforce, Apttus, Jobscience, and DocuSign, to name a few. It is no doubt a boon to have these readymade solutions to just pick from the market, plug in, and start using. But they are not maintenance-free; in fact, they need to be maintained all the time. That's why, just like Salesforce releases, these paid apps on the AppExchange have periodic releases too. These releases can pose regular headaches if they are not monitored and understood correctly. Basically, we need to keep an eye out for impact analysis on these changes coming into our org. We of course have no control over what new feature is coming in and have to blindly accept the push of these releases into our system. They do give a high-level listing of the features going in, but the catch is that when we do a deployment from our side, we can face issues because certain test classes were not implemented to cover the newly pushed components. We are left resolving all the errors that pop up when trying to deploy our own custom changes.

Some of my clients do reach out to these vendors to get test classes implemented, but they charge a bomb just to write test classes. So even though we had something ready for the business to use and run with, halfway through we realize we have got into a lock-in that is getting costly. This is especially true for SMBs, which slowly start to burn their fingers. There is no actual solution that fixes this. The other way out is to develop your own app in-house and maintain it, which over a period also becomes costly. To develop or not to develop, that is the question.

This question needs to be answered by weighing the urgency of the need, cost, scalability, and in-house development cost against the managed package license and incidental costs. Here too there is no easy way out; due diligence has to be done.

However, if your decision is to rely on a well-established managed package, and you are aware of the regular maintenance and the whole package of challenges it will expose, I would recommend efficiently monitoring what is going into these releases, tracked well in advance in your org. Think of a register keeper who knows what is coming in, so that we can do an impact analysis and be well prepared on our side to take on the challenges when they happen. In this situation there will be a release by the managed package vendor, and subsequent code coverage has to be achieved using test classes that they write or that we execute.

So how can we at least improve the process of managing and preparing for these releases? One thought is a release management app that can be used to manage our internal releases and, at the same time, function as a register tracking the components of these managed packages in a viewable format, so that we know the vendor's release and the components going in, and can do some impact analysis in advance of the actual release. This can also be as simple as an Excel file with the release name and the components and types coming in from the vendor. The main benefit of importing this into a release management app is efficient auditing, verification, tracking and impact analysis. It also helps in the long run. Say after six months you face an issue with a component and have no clue in which release, or for what purpose, the component came in: this is where the app helps you. You can search for the offending component and voilà, you have the full history at your fingertips, to fix it or even disable it if it is obsolete for your business.

So my solution here is a watertight internal process: study the managed app release well in advance, do our homework internally, and prepare for it. A release management app helps reduce the pain, as we can leverage it as a register, a gatekeeper that keeps track of what goes in.

It's our precious org, and we need to do a sanity check beforehand and be ready, rather than facing issues post-release and being at the mercy of vendors who can burn the hands of our esteemed clients. We don't want them to go through this again and again, so let's at least plan in advance and put a solid internal registry system in place.

Happy process building guys!

Memoirs of building a Salesforce.com practice from the ground up

The action begins:

When I joined the MNC in mid-October 2012 as a people manager, little did I know that it would become the journey of a lifetime. I had been working as a hardcore techie handling development activities in various CRMs like Amdocs ClarifyCRM, Siebel, Oracle CRM On Demand, and Fusion Middleware. I took charge as the Salesforce.com practice manager in India. Initially, we started with seven people, which later ballooned into eighty-eight at the peak. Over the course of my tenure, I managed consultants and FTEs from five locations in India and from Cyberjaya in Malaysia. The primary challenge for me was to build the practice from the ground up. What followed over the next few months was a really busy schedule of interviews and coordination with several supporting departments to make it happen. In the process, we scanned some two hundred CVs and conducted 150 interviews; finally, we selected and hired fifty people in six months. On one occasion we had to interview fifty people to select our first set of ready-to-deploy freshers. That was a huge effort, a real marathon. Meanwhile, the MNC's competitors in the market were really interested and curious, as the MNC was starting a consulting practice in a very niche skill set. We carried a huge expectation to deliver from the word go. The MNC's tagline, "Power to do more", was literally true in my case, as I was given the power and support to make things happen. I was allowed to make mistakes, and when I made them, my leader made me aware of it and I learned on the job.

 

As a delivery leader, I had to play a plethora of roles: delivery management, people management, trainer, mentor, pre-sales, motivator, and even strategist to keep people engaged and motivated. One of the most rewarding activities was having the team develop applications that helped us in day-to-day operations; this also made management activities more structured and streamlined. We built in house a work-from-home tracker, a timesheet tracker, a leave tracker, and project management tools that helped track all project activities. Periodic dashboards and reports mailed to managers kept them aware of the activities around them. There was an R&D team that handled various innovations in the Force.com and cloud space. The team also demonstrated considerable talent by implementing some of the toughest scenarios in the CRM universe: SOAP, REST, and web service callouts; integration with Boomi and Cast Iron; and the use of Java digital certificates and the Java keystore. Implementing security in web service calls, using touch platforms to move applications onto smartphones, and integrating Salesforce.com with CTI and hard phones were achieved in one of the projects.

 

One successful case study in remote management was an SFDC team in a different state. Given empowerment and transparency in communications, they rose to the occasion and became leaders in their own right, taking care of the practice and the projects assigned to them. The lack of a manager's physical presence did not deter them from working as a cohesive unit fully focused on delivering on the MNC's business commitments. They got the opportunity to discover the leader in each of them.

Overall, it has been a very enriching experience. Managing people develops a solution-oriented focus, as you look for solutions to the issues that come your way every day, and it also matures you to treat issues as interesting challenges that can always be resolved. But for this, one needs to take a step back and look at things from a broader perspective. When faced with a real roadblock, one of the questions I ask myself, and tell people to ask, is: in the course of human events, how important is the issue that is bogging you down? It is a phase, and you will recover. During such times the message is: regroup, refocus, and collaborate more. The MNC's Salesforce.com GTM India team has been able to accomplish so much in a two-quarter timeframe mainly by adopting this mantra.

The challenges:

The challenges we faced were numerous; some of them stood tall, but with positive signs of being overcome over time. Some of those we really had to grapple with are given below:

  • Facilities and travel for my direct reports. This was a huge challenge: we started and built the practice in a temporary setup and were still able to deliver. We had everyday problems like network issues, and my team adapted and found their own workarounds.
  • Some of the facilities were in remote locations, which made travel an issue and introduced problems of work-life balance and fatigue, with some people spending one to three hours travelling one way, in another city as well as in Bangalore.
  • Mentoring resources and meeting their aspirations around salary, role, work satisfaction, travel, etc.
  • Dealing with other departments, making them aware of our issues, and trying to get help from them.
  • Dealing with inter-departmental dynamics as well as dynamics within the team.
  • Handling administrative activities and project issues like synergy problems between the on-site and off-site teams; this was natural, as our practice was new and the team had to go through the forming stages.
  • People giving up due to frustration, and then motivating them back to set a project on track; that project was a big success in the end.
  • A leader can never get tired: he is the person everyone comes to with their issues, and you need to always look ready to listen and suggest solutions. Yes, sometimes I got personal issues to resolve too. It was an eye-opener, something you don't get from textbooks.
  • In spite of all the above, I still needed to keep everyone motivated; this was the biggest challenge.
  • Managing teams from Malaysia (briefly) and from Bangalore and Noida, and in one phase indirectly attending to issues of consultants from Trivandrum and Chennai.
  • Tough competition in the market.
  • A dearth of senior architects.
  • An organizational re-org in which half of my team was reassigned to a new business unit.

Taking measured steps:

We need to be extra cautious when taking up projects from a different department, as they can have a very sensitive history. You need to do your groundwork and learn about the people and the setup beforehand. I stumbled badly on two occasions when I walked in without proper preparation, and it was a big learning experience for me; it was probably the only time I got my boss really frowning at me. Politics and sentiments are areas we really need to handle well, and we in fact need to do some homework, or else we will get into a sticky situation. What you see might not be what you get from people; you sometimes need to prod a little more to understand the actual situation, and sometimes the direct approach might not work out either.

Stepping it up:

Amid all these issues, we started getting innovative. Some of the novel ideas we implemented were:

  • Used the Chatter collaboration tool for remote coordination and discussions.
  • Leveraged a flexible working model, with work-from-home one to two days a week.
  • Tracked work-from-home in an application, so there was no misuse and we had better visibility.
  • Developed our own PM tools for better project management.
  • Formed discussion groups on Chatter based on projects.
  • Set up management bulletin boards.
  • Held weekly location-specific calls, and a combined call once every two weeks.
  • Brought in a PM for all projects; senior people were given the opportunity to play diverse roles.
  • Adapted to the facilities and travel issues.
  • Ensured there was an everyday stand-up meeting for a handshake between onsite and offsite.
  • My team actively took part in various stages of projects, including pre-sales.
  • Implemented and established our own center of excellence, where POCs were developed and used to sell our capability.
  • Got the entire team certified in various areas, which helped in attaining Platinum partner status.
  • Focused on the Indian market, with the team getting involved in pre-sales.
  • Captured resource utilization effectively using a dashboard and a custom-built time tracker. The application dashboard is given below.

Capture1

The People factor:

We were lucky to get a great team. They had the talent, and all they needed was the right motivation and guidance. They were also very young and going through a new learning curve on how to excel in a budding practice where things like methodology and processes had to be done from scratch. Even the tools needed to support a project had to be developed in house; a great job done by the PMO team. One of the success stories was the ready-to-deploy freshers' team: we inducted freshers who were trained and guided by our technical leads and brought up to speed fast. They are now working on projects and are all billable. We found that having juniors in the team to mentor created a feel-good atmosphere; given the opportunity to mentor, my team rose to the occasion and took it in their stride.

Remote collaboration was an issue, but in the end the team realized that the sooner they got comfortable with it, the better we could work as a true global organization, and that it brings the opportunity to experience diverse cultures and more learning. The long-distance travel issue was resolved by giving one to two days of work-from-home, and this really showed results. An application screenshot of the tracker is given below:

Capture2

A promising future ahead:

The challenges are there every day, and we need to be on our feet all the time. Currently, the practice has a good, healthy pipeline of projects, kudos to the tireless efforts put in by the leaders, the on-site team, the India team, and pre-sales. When I look back, this experience made me evolve as an entrepreneur, and now I am running my own startup, www.cloudunbound.com.

The key lesson I learned: to grow to the next level, do not be afraid to explore and do things outside your comfort zone.

Godspeed!

 

Cloud Sentinel – Remote patient surveillance & support

With the advent of technology in smart devices, it has become possible to get personal data in real time, even as people move around. This has opened up a plethora of possibilities in the healthcare space: medical devices fitted with sensors feed information to smart devices via Bluetooth or Wi-Fi, and these devices in turn use a mobile app to push the information to a cloud application. Before we delve into these infinite and exciting possibilities, let's revisit the technology of sensors and smartphones and how they couple together.

remote

Sensors were in smartphones even before they became 'smart'; number keys and microphones were already doing the job. Cutting-edge breakthroughs then rapidly enabled phones to incorporate touchscreens, accelerometers, gyroscopes, GPS, and cameras, to name a few. Accelerometers and gyroscopes are used for orientation sensing, image stabilization, and proximity sensing, and are widely used in pedometers and for sensing vibrations or movements. Sensor-related communication typically happens over Bluetooth, Wi-Fi, or WiDi.

An example of a sensor still at the prototype stage is a contact lens that can monitor blood glucose levels. It takes a reading every second, using a very small wireless chip and a miniaturized sensor that can detect glucose. The chip and sensor are embedded between two layers of soft contact lens material. Brian Otis and Babak Parviz, both from Google, are the people behind the project. They wrote: "We're testing prototypes that can generate a reading once per second. We're also investigating the potential for this to serve as an early warning for the wearer, so we're exploring integrating tiny LED lights that could light up to indicate that glucose levels have crossed above or below certain thresholds. It's still early days for this technology, but we've completed multiple clinical research studies which are helping to refine our prototype."

Smartphones can already be used to take blood-pressure readings or even do an ECG. ECG apps have been approved by the U.S. Food and Drug Administration for consumers and validated in many clinical studies. The apps' data are immediately analyzed, graphed, displayed on screen, updated with new measurements, stored, and shared. A patient can today take a reading and share it with a doctor on WhatsApp, or even upload it from the phone to a cloud application where a doctor can analyze it with a panel of doctors on a dashboard. Hospitals of the future will be remote, patient-less data monitoring centers. Surveillance will be 80% of the activity in these command centers, staffed by doctors and nurses who can work from the comfort of their homes, while travelling, or even at a party, attending a quick expert session on a critical case.

Smartphones will soon become mini lab technicians: they will get data from medical devices, analyze the results in a mobile app, and, for more detailed processing, push them into a cloud application accessed in a central surveillance or analysis center. These data will also be shared with experts on the move for a second opinion. In the near future, physicians' roles will shift; 90% of the time we might not even need them, as medical devices, smartphones, and supercomputers that process this patient information will be in a position to prescribe a medicine, which will then be transmitted to a pharmacist or to your mobile. These computers will process billions of data points about your full life-cycle medical history, medicine usage to date, and allergies, and, combined with current data, give you a cognitive medical-care recommendation. All of this starts with the journey from sensor to smartphone and finally to cloud.

A surveillance service case study:

Imagine a scenario whereby a patient needs monitoring as he moves around doing his daily chores and activities, especially if he is a senior citizen and his children are located far away. I would like to take the reader through a chronology of events to depict how the technologies can work in tandem to avert a fatal event. We will call the whole surveillance solution CloudSentinel.

  • Jagdish is a 70-year-old man. He retired from work long ago and is leading a peaceful, settled life in Bangalore. Like any other senior citizen, he is affected by some minor ailments that come with old age. Recently he had a stroke, and his son, who is settled in San Mateo, CA, registered him for the CloudSentinel service offered by XYZ Company in a tie-up with ABC Healthcare Hospital.
  • The son can see his father's vital stats, like blood pressure, glucose level, and pulse, on demand in a report or a live feed, as applicable and required.
  • Event One: As usual, Jagdish gets up one morning at 6 am and goes on his normal jog through the countryside. He wears a wristband that monitors his pulse and transmits the details to the smartphone he always carries with him. Thirty minutes into his run, he suddenly experiences a pain in his chest and collapses.
  • Every minute, the sensors in his wristband register a pulse reading from his wrist and relay a signal to the app installed on his smartphone. This app, which we will call KeepAlive, relays his pulse every minute to the cloud application, which is Salesforce, and his real-time statistics are monitored in a command center 24×7. His heart attack produces a non-uniform, arrhythmic signal over a span of two to three minutes. This signal is interpreted by the Android ADK listener and translated into data for the Salesforce1 mobile app (a simplified listener sketch follows this list). An alert notification is sent via the Salesforce cloud application to the command center console at the healthcare hospital, and Jagdish's observation parameters on the screen are highlighted in red. A notification is sent to the physician in charge on his smartphone; another goes to the ambulance driver and the nurse in charge. The driver locates the physical position on his smartphone using the map display in the KeepAlive app; this map is integrated with Salesforce using geolocation fields.
  • The ambulance locates the exact spot using Google Maps, identifies the probable location of the patient, and pulls up near Jagdish, who is now unconscious and lying on the ground.
  • Event Two: The nurse administers a basic vital-stats check, with parameters taken using medical devices. The sensors in the medical devices relay the data to the mobile app over Bluetooth, and this data is relayed via Salesforce in real time to the hospital's control center, where the physician looks at the statistics being fed onto his screen and advises the nurse in the ambulance on the procedures and medication doses to administer, all while the ambulance races towards the medical center. The statistics interchange between physician and physician can also happen via a mobile app like Salesforce1 that relays the latest parameters.
  • Event Three: The patient, still unconscious, is admitted to the ICU, and by the time he arrives, all his treatment details and procedures have been set in motion. Precious, timely treatment helps him get his life back.
  • Event Four: Jagdish is discharged from the hospital and is now resting at his own home. He gets a message on his app to take his medication, with the name of the medicine and the dosage given in an alert. He takes the medicine and marks it as completed in the app. This feedback is registered in the system, and the physicians can see his timely intake of the medicine. If they see no updates, they can always place an outbound call via CTI and the Call Center console from Salesforce.com.
  • Event Five: Priam, Jagdish's son in the US, logs into the CloudSentinel application and sees the whole series of events in a report, with the final status as "in recovery". He also checks the dashboard of his father's body parameters for the week, which gives him a sense of relief. He picks up the phone and talks to his father.
  • CloudSentinel will ensure that we are all, at some point, monitored by medical angels that may one day save our lives, helping us live longer. Thanks to the combination of medical sensors, smart devices, and applications on a cloud platform, we are taking preventive and reactive healthcare to a whole new level.
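To ground Event One a little, here is a simplified Android sketch in Java of how an app like the hypothetical KeepAlive could listen to the phone's heart-rate sensor. The irregularity check is deliberately naive and purely illustrative; real arrhythmia detection is far more involved:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Simplified sketch of the hypothetical KeepAlive listener.
// Needs the BODY_SENSORS runtime permission and a device or wearable
// that actually exposes a heart-rate sensor.
public class KeepAliveListener implements SensorEventListener {

    private float lastBpm = -1;

    public void start(SensorManager sensorManager) {
        Sensor heartRate = sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE);
        if (heartRate != null) {
            sensorManager.registerListener(this, heartRate,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float bpm = event.values[0]; // heart rate in beats per minute
        // Naive illustration of a "non-uniform rhythm": a large jump between
        // successive readings. Real arrhythmia detection is far more involved.
        if (lastBpm > 0 && Math.abs(bpm - lastBpm) > 40) {
            // This is where KeepAlive would push an alert reading to the
            // cloud application (see the upload sketch further below).
        }
        lastBpm = bpm;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}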

PS: Soon physicians will be replaced by computers with cognitive capabilities, and patient data will be available in a central repository maintained by the government. This can be an extension of the solution where all medical centers see the same patient data, so a doctor has the patient's full medical history in front of him, which helps him prescribe the right treatment.

The key communication is from the wearable medical device to the smartphone via Bluetooth, with the data captured by an Android application that pushes it to the cloud (a minimal sketch of this last leg is given just below). The use case may be a bit over the top, but it illustrates the possibilities we can bring onto this platform. The main competition will be from HIS systems, but it can be overcome by ensuring that HIPAA compliance factors are taken into account from both a product and a process point of view.
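Here is that minimal sketch of the last leg, the Android app pushing one reading to a cloud endpoint. The URL and JSON shape are invented for illustration; a real Salesforce integration would go through its authenticated REST API with an OAuth token, and on Android this must run off the main thread:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Minimal sketch: push one pulse reading from the phone to a cloud endpoint.
// The endpoint URL and JSON payload are hypothetical; a real Salesforce
// integration would call its authenticated REST API with an OAuth token.
public class ReadingUploader {

    public static int upload(String patientId, float bpm) throws Exception {
        URL url = new URL("https://cloud.example.com/api/readings"); // hypothetical
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        String json = String.format(
                "{\"patientId\":\"%s\",\"bpm\":%.1f,\"unit\":\"beats/min\"}",
                patientId, bpm);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode(); // e.g. 201 Created on success
    }
}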

Solution Components:

The communication between sensors and the Android OS in smartphones works along the following lines:

  • Application Programming Interfaces (APIs) expose a phone's sensors and sensor data to the programmer.
  • Access to the sensors on an Android phone is available through the SensorManager class in the android.hardware package of the Android SDK (http://developer.android.com/sdk/). Supported sensor types include Accelerometer, Ambient Temperature, Gravity, Gyroscope, Light, Linear Acceleration, Magnetic Field, Orientation, Pressure, Proximity, Relative Humidity, and Rotation Vector.
  • Not all sensors are present in all phones; SensorManager.getSensorList() returns the ones available on a given phone (see the snippet just below this list).
  • The Android Accessory Development Kit (ADK) even provides the ability to add external sensors in a standard way (http://developer.android.com/guide/topics/usb/adk.html). The ADK is to hardware development what the SDK is to software development: it sets standards.
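Here is that snippet: a minimal Java sketch that enumerates whatever sensors the current phone actually has:

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.util.Log;
import java.util.List;

// Print every sensor this particular phone exposes, with its vendor.
public class SensorLister {

    public static void listSensors(Context context) {
        SensorManager sensorManager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);
        for (Sensor s : sensors) {
            Log.d("SensorLister", s.getName() + " (" + s.getVendor() + ")");
        }
    }
}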

The ADK is built on top of Arduino (http://www.arduino.cc): a board with pins to connect to a sensor, a microcontroller you can program to interpret the data from the sensor, some memory to hold your program, and a USB port to send the data somewhere useful. The key is the pins it uses to connect to sensors: sensor manufacturers recognize these pin standards, so you don't have to deal with all sorts of different communication protocols when interfacing with a sensor.

References:
  • Android SDK: http://developer.android.com/sdk/
  • Android ADK: http://developer.android.com/guide/topics/usb/adk.html
  • Arduino: http://www.arduino.cc
  • http://www.sensorland.com/
  • http://www.epd-ee.eu/article/8112
  • Android sensor listener class (SensorEventListener)
  • Salesforce Wear: https://developer.salesforce.com/wear

Disclaimer: Pictures attached and reference materials have been mentioned only for explaining the concept. For any objections, please reach out to nyjijoshi@yahoo.com. I have not personally implemented the above solution; I am just thinking out loud about how all these components could come together and work. Cloud Sentinel is a fictitious product or service.