Integrating SAP Predictive Analytics with SAP BW/4HANA and SAP BPC



Hello, my name is Jan Festa. I am happy to have you here for this webinar on BW, BPC and their relationship with SAP Predictive Analytics, and I am presenting today together with Rao. We will also touch slightly on forward-looking topics, so please be aware of this disclaimer. But without further ado, let's jump directly to the agenda. The plan for today is to look very briefly at SAP BW/4HANA and especially at how it relates to predictive in general. Many of you will be much more of BW experts than I am; I am really focusing on Predictive Analytics, but there is a deep affinity between both products, they work together, and that fits nicely from both sides. After that, the main part will really be what you have been expecting: the integration of Predictive Analytics with BW/4, especially showcasing a couple of very concrete scenarios of how this would be done. I will try to be quite technical there, really going down to the object and code level, to give you a good feel for how this happens. We will not be demoing things live, but all the items are screenshotted, so that you have the full information to essentially start your own project with it, or to reach out to us with questions. After those first two items, Rao will take over to showcase how the integration with BPC would work.

So, enterprise data warehousing. All of us in enterprise software know how important this is; I want to look at why it is so important for predictive. What we love in the predictive world about enterprise data warehouses is that they have clean data, centralized data, reliable data, lots of it, and lots of historical data. They are really the go-to place for reliable information, and this is very much what predictive needs. There is a nice saying that without data there is no data mining, and obviously without good data there is no good data mining. Enterprise data warehouses aim
exactly at solving this data-quality issue, and this is a very natural fit for us on the predictive end: to take this reliable, cleansed data, enhance it, and find patterns that make it even more interesting for you. Looking concretely at SAP Business Warehouse, it is not just any enterprise data warehouse: lots of customers entrust their central enterprise data warehouse to us, and a growing share of them is trusting in the power of having BW on HANA or even BW/4HANA. Everything we are saying today applies to both BW on HANA and BW/4HANA, but since BW/4 is the product we are especially focusing on, all the slides will be looking at BW/4; just be aware that BW on HANA really works the same way from a predictive angle. There is a long history of BW integrating tightly with HANA, and it does so in a way that predictive can leverage: predictive looks at objects on a native database level, and really only with the BW on HANA family, and later on BW/4HANA, could all your BW objects also generate their counterparts on the database level. This is where PA, Predictive Analytics, comes in. You might have seen at various events already what BW/4HANA is aiming to achieve. Simplicity: this is a shared value for us in predictive, where we focus very much on making the journey from data to running model smooth and simple, not a black art for mad mathematicians, but something that every interested data analyst can really achieve, all the more so if the data is of the quality that you would have in an enterprise data warehouse. Openness is a big thing on the BW front, which I already mentioned, and we leverage this openness from the Predictive Analytics side. Modern interfaces: for predictive, we are
more and more on a journey to rejuvenate our UIs, following Fiori principles and giving them an up-to-date look and feel. And high performance: that is very much at the core of what PA does as well, namely leveraging the power of HANA. Predictive calculations are oftentimes very performance-critical, because we are digging through so much data, so pushing the algorithms down to where the data is, meaning HANA, is very much what we strove for in architecting the Predictive Analytics solution as well. If you have looked at BW/4HANA rollout sessions, this slide might ring a bell: over the long-standing history of BW, a lot of object types were invented that partly overlap or serve specialized usages, and now, with BW/4 and the brave decisions that could be taken by introducing a new product family, the object model has become a lot more clear-cut. There is a clearer way of which objects to use for persistency and which for virtualization, stripping things down to four bare-minimum objects that are very clearly dedicated to their task. With that, we are already coming closer to the predictive side of the house, because since the advent of BW on HANA, and also with BW/4HANA, all of these objects can, if you set a single switch in your modeling environment, be generated into HANA calculation views that are identical from a data standpoint, a behavior standpoint, and a handling standpoint; they can be considered entirely equivalent to their BW counterparts. And since Predictive Analytics has its roots very much on the database front, this is where the integration all happens. You can see this further to the right on the slide.
So, apart from those generated views, you can obviously formulate additional ones on top that bring in additional data, add calculated columns, or add aggregations, and all of these are available through SQL, which is, as I said, where predictive looks at things. So much for the very brief part on BW; now let's dig deep into the integration piece. From a positioning and product-family standpoint, the analytics product family is the natural, or let's say default, option for doing analytics on top of both BW and BW/4, comprising both business intelligence and enterprise performance management, and now our brainchild for today, Predictive Analytics. This slide is important because it is going to be a reference slide throughout the talk from now on, so let's read it from left to right. The first three columns show what you have just seen: BW generating its objects into HANA format, which you either consume as calculation views directly or build additional HANA artifacts on top of. Now Predictive Analytics comes in: it is a whole suite of tools that connect to this BW, well, to a HANA world, and give you the means to formulate predictive models, dig into this data, learn from it, and, once you have learned from it, take the structures you found and formulate predictions. An online connection here means directly accessing the data plus, and this really cannot be stressed enough, doing the calculations in place. As I said, predictive data mining and machine learning are oftentimes very number-crunching-intensive, and this is all pushed down to the HANA level. The Predictive Analytics suite tells HANA how to look for patterns, what to dig for, what my target is, what types of functions I try to use, and then HANA takes over and does it all, and this is why it is all lightning fast.
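To make the "available through SQL" point a bit more tangible, here is a minimal sketch of what querying such a generated view could look like. All package, view, and column names here are invented for illustration; the actual names depend on how your BW system generates its external HANA views.

```sql
-- Hypothetical query against a BW-generated external HANA calculation view.
-- "SYSTEM-LOCAL.bw.bw2hana::SALES" stands in for whatever name your
-- system generates; the columns are made up for the example.
SELECT "STORE_ID",
       "CALMONTH",
       SUM("SALES_QTY") AS "TOTAL_QTY"
  FROM "SYSTEM-LOCAL.bw.bw2hana::SALES"
 GROUP BY "STORE_ID", "CALMONTH";
```

Because the generated view behaves like any other HANA artifact, the predictive tools can read from it directly, and the heavy computation stays inside HANA.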
Now, once all this digging has happened, you have a model, and at the end of the day a model is nothing more than a complex mathematical equation, formulated in such a way that it can easily be executed on HANA straight away. This is where predictions come in. You learned from the past: patterns of customer purchasing behavior, the churn behavior of customers who are about to leave us, customer reactions to marketing campaigns, retail store patterns. You look at the past, try to find what drove that behavior, and capture it, and now, on yet unseen data, you leverage this knowledge to make predictions. That is the cycle we always have in mind. These predictions would then either be materialized or calculated virtually, but they happen in HANA, and HANA artifacts like, here, a calculation view would take these predictions, materialized or computed on the fly, and have them as HANA objects. The next step, to carry this HANA object over to the BW world, is to encapsulate it there and make it a standard InfoProvider that is beefed up with those predictions. You will have your historical sales pattern plus the forecast of how sales will evolve, what products can be expected to be bought together, what customers should be approached for the next marketing campaign, and so on, all enriching the data you already have with the additional information that the machine found in these patterns. Another technical way to integrate today is to leverage the so-called HANA Analysis Processes, a technology that has been in BW for quite some time, with the idea that there is an InfoProvider, and something that works on the data of that InfoProvider, which can be a function module, a procedure, or something else, piping the output into yet
another InfoProvider. So this would be an alternative way of achieving a very similar thing, but tied even more closely to the BW world. Our Predictive Analytics suite would calculate a model based on that InfoProvider and spit out a model, hence this potentially complex mathematical equation, and surface it as a procedure, and HANA Analysis Processes are able to consume procedures: take the data from an InfoProvider, hand it over to a procedure, and the procedure hands its output over to another InfoProvider. And this procedure can do anything HANA can do: it can do text mining, it can do spatial analysis, it can use the Predictive Analysis Library, the library within HANA that nowadays has all the standard textbook algorithms implemented, it can of course use R integration, which is our way of achieving essentially limitless analysis possibilities through all the algorithms that the academic community supports, and, something very dear to our heart, it can use the automated procedures, so automatically mining the data and automatically finding the model, which from a time-to-value perspective is oftentimes really the best way forward. At the end of the day, what this slide is trying to say is: you can, by virtue of Predictive Analytics, build a procedure, and this procedure can be readily consumed in HANA Analysis Processes. If you have worked with Analysis Processes before, a lot of this will sound familiar; for the rest of you, I will be showing a very detailed slide on how this interaction goes. Now, as promised, let's look into a couple of very concrete scenarios of how such an integration could happen. Scenario 1: predictions on a regular basis. What could that mean? Imagine you are a retailer and you wonder: how many people will be coming to my stores next month, next week? How do I need to plan my staff accordingly? How many products will
be sold of this type, how many of them do I need to source, and so on. Because time moves forward, this is something that needs to be done regularly; it is not a time-critical thing where I need the answer right now. So oftentimes it is smart to take the data, do the additional calculations on top of it, and revisit them regularly. We have been working very closely with consumer-products companies, for example, beefing up their market research data, or, for retailers, their customer loyalty card data; this too is data that evolves over time, where you don't need an answer right now, but you want to run such predictions regularly. In predictive maintenance we see similar scenarios: no need to calculate on the fly, but a similar model retrained and applied at regular intervals. On the other hand, there are definitely cases for real-time analytics. An example that directly comes to mind is fraud: imagine yourself being a bank, or otherwise handling customer transactions where it is a bit unsure whether they will eventually be fulfilled from the customer end. You have a very fast turnaround, here is my credit card payment or ATM withdrawal, and you want to react in milliseconds to the individual transactions. This is obviously a case where such a direct interaction makes sense. We have seen similar scenarios for used-car pricing, for example, where customers submit the various specifics of a car and expect a price back; so as soon as transactions of that type are involved, real time is a necessity. And then sometimes you want to be even closer to your BW transformation flow, and this is where HANA Analysis Processes will be your aid. As I said, the slide we had earlier resurfaces in one way or another. For scenario 1, the left-hand side we have already looked at a couple of times, and
now on the right-hand side you see a new piece, namely Predictive Factory. Predictive Factory is our tool, part of the Predictive Analytics suite, that is essentially a repository of models. The thinking is always: imagine you found a good model, and now you want your data scientists to move on to the next big problem and make this solved problem an IT problem. So you essentially check that model in to Predictive Factory, and Predictive Factory takes care of managing it: managing versions, checking whether the model has deteriorated, retraining it if necessary, obviously applying the model regularly, and potentially also multiplying the model out. We see this a lot in time series. We have a big retail customer who says: we need accurate predictions for bread, because bread is a core product; if we don't have bread at the bakery counter at the entrance of our stores, in the correct quantities, people will go next door. So we need this prediction of how much bread of which type will be sold in which store tomorrow, and in the end these will all be different models, because rye bread behaves differently from wheat bread, and store A differently from store B. So these will be many models, and managing them is a task of its own, and here we are back on SAP home turf, making things enterprise-ready: taking the model that was once developed as a template and multiplying it out along a multitude of stores and a multitude of, in that case, bread types, or products, product types, and product categories. We have similar cases with manufacturers who say: I need to regularly calculate my production demand, possibly by product category and product type, and I want to do this regularly. Or consider a sales pipeline analysis that you want to do by product category
and region and customer and customer category. This easily means a lot of models, and this is why we felt it necessary to come up with a tool that manages this multitude of models; you could also call it a model manager at the end of the day. Predictive Factory hence takes those models and repeatedly executes them along a schedule that you define. While training or retraining those models, it pushes things down to HANA; all the delegation happens in HANA. And once the predictions need to be calculated, so concrete scores, a concrete timeline, concrete forecasts are to be generated, there is one important area, and I hope you find it in the corresponding screens of your Predictive Factory, that says: whatever I calculate for the future, where do I write it to? Here you can fix this and say: there is a table destination, with a schema and a concrete table name, and I will write my scores, my predictions, into that table. Because they are written to a defined place on that schedule, BW can pick them up, and you can easily use standard means in HANA to join these predictions with the historical data that you used for model training and that your dashboard users will eventually need in the query. So here the standard HANA modeling kicks in, because at the end of the day this is just another table, and you can easily bring it all together; and if you are fancy, you can take this HANA artifact and expose it as an Open ODS view or a CompositeProvider. We are, as we speak, developing the material for TechEd this year to show this live in a hands-on session, which goes through exactly these things, and you might want to visit and participate in that. Let's look at this in slightly more detail.
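As a sketch of that "standard HANA modeling" step, a plain SQL view could bring the Predictive Factory output table and the historical data together. This is an illustration, not generated code; every schema, view, and column name in it is an assumption.

```sql
-- Illustrative only: combine history (from the generated BW view) with
-- the forecasts that Predictive Factory wrote into its table destination.
CREATE VIEW "PRED"."V_SALES_WITH_FORECAST" AS
  SELECT "STORE_ID", "CALDAY", "SALES_QTY" AS "QTY", 'ACTUAL' AS "RECORD_TYPE"
    FROM "SYSTEM-LOCAL.bw.bw2hana::SALES_HISTORY"
  UNION ALL
  SELECT "STORE_ID", "CALDAY", "FORECAST_QTY", 'FORECAST' AS "RECORD_TYPE"
    FROM "PRED"."PF_FORECAST_OUTPUT";  -- table destination set in Predictive Factory
```

A view like this is the kind of HANA artifact you would then expose to BW as an Open ODS view or consume in a CompositeProvider.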
The model training, so the pattern finding, the finding of the weights, is a step of its own, and for that we need the data that lives in BW but is exposed as an external HANA view. As an external HANA view, we can talk to it from the predictive end: Predictive Factory is aware of that view and allows you to specify a model on it, or you can use any other tool in the Predictive Analytics suite, like Automated Analytics or Expert Analytics, to formulate such a model, but they will all share the same HANA source data. Predictive Factory will then be writing the predictions into a HANA table, and because it is then natural HANA data, it can be beefed up, that was the bottom-left bubble on the last slide, into a HANA object that joins historical data and predictions, so your old data and your forecast, and at the end BW can consume that same view in an Open ODS view or CompositeProvider. That is it for scenario 1: if you want to generate predictions on a regular basis, you are safe to use Predictive Factory to generate data into a given table, and BW will pick it up from there. Next up, if you remember correctly, we talked about real-time embedding. To that end you would typically train a model and encapsulate this model into something executable: obviously, when you want to calculate something at runtime, you need code to do so, so you need some way to produce this code, and the easiest way to do that for models is to use the predictive suite and the automated model training and code generation capabilities that are in Predictive Analytics. To that end we introduced, in the release of last November, Predictive Analytics 3.1, a feature that lets you generate any model into a so-called UDF, a user-defined function. That is an artifact, and this artifact can be taken with you.
It is just a function at the end of the day, but it contains all the weights that are required for doing a prediction, and you can call this function from a scripted calculation view and consume that in HANA. So let's look at this in slightly more detail. Model training as before: you have an InfoProvider exposed as a HANA view, your training is based on this HANA view, and you generate a user-defined function from it. That is simply a capability in the whole tool set: training a model and taking that model outcome as a user-defined function, which is physically just code, as you will see on a slide in a second. This user-defined function is a CREATE FUNCTION statement, so you can execute it, and the privileges will be those of the executing user. After that you build, on top of that function, yet another artifact, the one that does the predictions, and you can do standard HANA modeling and standard BW modeling for that. I will quickly go over the slides and pick out the details; since we will be sending out the slides, you can take this as a reference. It is becoming techy in the next five slides, but it is useful reference material. To make it a bit more tangible: what you would typically do is take your BW object that is a calculation view, take your UDF that was created by your Predictive Analytics tools, and build a join, a new object, taking the information from this historical data and from your function, to have both in one place, and then consume them in BW. So what would this mean? In your PA tool you choose that you want to generate your model into a UDF. What is that? Quickly, for time reasons I will step over this fast: it starts with a bit of comment telling you how you would eventually use the generated function, and then
it creates a function, a function called claims_fraud_score, that takes a couple of input parameters and eventually returns an output, and that output is a score. It does so based on the different weights that the incoming parameters carry, and these weights are exactly what the model building computed. There is nothing here that you would read in detail, because it is just a math function. Now, this you can execute: you take this CREATE statement into your HANA console, and then it becomes just another function, another object in your catalog. Then, and this is the slide with the five bubbles, you can take your original data that you learned upon, and your new data, and beef it up with predictions by creating a new calculation view from it. So you take your data, this is your calculation view, and you beef it up in a scripted calculation view that calls, and now this name might sound familiar, claims_fraud_score, exactly the UDF that you created earlier. So there is a new field, the fraud score, that is part of this new HANA view, and at runtime it will run through the whole weight matrix, find the score, and call it fraud score. If you then carry this back into BW, you take that score and make it yet another field in your BW object, and there you have it: if you inspect that new InfoProvider using standard BW means, you will have your insurance fraud probability, and potentially other scores that were calculated with it, as part of your BW entity. And then all the analytics, all the BI that you want to put on top, any usage: you are absolutely free; all the heavy lifting is happening in this UDF. So let's move on; I need to speed up slightly, I see.
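For reference, here is a drastically simplified sketch of what such a generated scoring function and its consumption could look like. The real code generated by Predictive Analytics contains the full weight matrix of the trained model; the weights, parameters, and object names below are invented for illustration.

```sql
-- Toy stand-in for a generated scoring UDF: the "model" is just an
-- equation over the input parameters (the weights here are made up).
CREATE FUNCTION "CLAIMS_FRAUD_SCORE" (
    in_claim_amount DOUBLE,
    in_customer_age INTEGER )
RETURNS score DOUBLE
LANGUAGE SQLSCRIPT AS
BEGIN
  score := 0.8 * :in_claim_amount / 10000 - 0.05 * :in_customer_age;
END;

-- In a scripted calculation view, the function is then applied per row,
-- yielding the new FRAUD_SCORE field that is carried back into BW:
-- SELECT "CLAIM_ID", "CLAIM_AMOUNT", "CUSTOMER_AGE",
--        "CLAIMS_FRAUD_SCORE"("CLAIM_AMOUNT", "CUSTOMER_AGE") AS "FRAUD_SCORE"
--   FROM "SYSTEM-LOCAL.bw.bw2hana::CLAIMS";
```

The design point is that the scoring logic lives entirely in the function, so the surrounding BW and HANA modeling stays completely standard.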
We want to have time for questions and answers as well, but I didn't want to skip this part: now we are talking about HANA Analysis Processes. HANA Analysis Processes are, as I said, a means of connecting InfoProviders and doing transformations in between, typically in the form of HANA procedures. How would we do that? There are two ways of bringing such a procedure into being: either BW generates a skeleton for you and you fill it with life, or you have the procedure ready beforehand and you wire it into the HANA Analysis Process. So here is the first way: a user-defined function generated from Predictive Analytics, called from a HANA Analysis Process. Model training happens as before; really, step number two is as before: you generate the user-defined function and execute it in your SQL editor, so it is in the catalog. Then, in this HANA Analysis Process transaction, you generate a procedure skeleton and from there call this generated UDF. Afterwards, every HANA Analysis Process has a data target, something the scores are written to, and this can then be forwarded on to a CompositeProvider, to yet another data transfer process, or even to another Analysis Process for that matter. So, very quickly: you have your HANA Analysis Process, you have your source, your procedure, and your target, and now you say, there shall be a procedure, and I want to generate it. It generates the skeleton, taking all the fields and pushing them in as the data source, and then a coding window opens, a procedure window, where you can do whatever you want, like, for example, talk to the user-defined function that we generated in the last step and just call it; so it is a procedure wrapping the generated function. What you would then do, now we are on the data target tab:
you just map what comes out of the procedure onto the target, deciding what happens with each field, and here, in the mapping screen, you see that the auto insurance fraud probability is now being handed over as well. An alternative way is not to generate the procedure from the HANA Analysis Process, but to take a procedure that you already have. Such a procedure would typically come into being because you feel very familiar with SQL coding, and there you have the full freedom of SQL coding in your SQL console, with all the goodies that HANA gives you: text, spatial, PAL, the automated library APL, R, and so on. This procedure just has to follow certain naming conventions, and then you can call it and build it in. Let me quickly jump over this as well. Here I need to obey a certain signature, because otherwise the HANA Analysis Process cannot talk to it: I am writing a read-only procedure that takes some in-tab and writes some out-tab, and these are structures that have been defined. My data set in this case is really simple, just a timeline and a level, so imagine a very simple time series; that is what is happening here. In this case I am doing a forecast: I call a procedure that I coded earlier, which takes all these configuration tables and eventually spits out a result, and this result is then handed over to the out-tab that the procedure gives to the outside. And because BW requires capitalized names for fields, it is PREDICTION_LEVEL, and this I take from the result tab. So all the real modeling, the math part, happens in there, but that is standard PAL or APL coding, and the documentation will be your friend for this. What is interesting is this in-tab and out-tab, and now we are back at the same HANA Analysis Process screen as before.
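As a reference sketch of such a read-only procedure, under the assumption that the table types and the helper forecasting procedure already exist (all the names here are illustrative, not taken from the slides):

```sql
-- Illustrative HAP procedure: takes TIMELINE/LEVEL rows in, delegates the
-- forecasting to a previously created (e.g. APL-based) procedure, and
-- returns the result with the capitalized field names BW expects.
CREATE PROCEDURE "PRED"."TR_FORECAST_HAP" (
    IN  in_tab  "PRED"."TT_TIMELINE_LEVEL",       -- columns: TIMELINE, LEVEL
    OUT out_tab "PRED"."TT_TIMELINE_PREDICTION" ) -- columns: TIMELINE, PREDICTION_LEVEL
LANGUAGE SQLSCRIPT READS SQL DATA AS
BEGIN
  -- hand the data over to the forecasting procedure coded earlier
  CALL "PRED"."RUN_FORECAST"(:in_tab, result_tab);
  -- map the result onto the output structure of the analysis process
  out_tab = SELECT "TIMELINE", "PREDICTION_LEVEL" FROM :result_tab;
END;
```

The wrapper itself stays trivial; the actual modeling lives in the called procedure, which is plain PAL or APL coding.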
Instead of the generated procedure, we now have a managed procedure that we declare here and start wiring into the HANA Analysis Process. So we register it and say: how do the parameters of my procedure map to the parameters that my first InfoProvider and my second InfoProvider have? I am mapping them here: here I am on my data source, doing my mapping, and on my data target I am doing the same thing. Then I can execute this upon activation, take this CompositeProvider here and look at it, and here I now have my measurement data and my predicted level from my time series. And with that I would like to hand over to Rao, who will quickly run over the BPC slides. [Rao:] Okay, thank you, everybody. What I am going to show here is a BPC predictive scenario. In this business case, what we want to see is how Predictive Analytics can help a BPC scenario; there is the business side of it and the technical side of it. It involves BPC, it involves Predictive Analytics, it involves SAP HANA and several other components, and it has a UI, as I will show you, along with the architecture of how we built this scenario. The business case here is: the business wants to enable its users to make data-driven decisions using machine-learning-based time series forecasting, and to increase the accuracy of the planning by using the predictions as the basis of the forecast in their BPC application. The background is this: in BPC, what you normally do is take the historical data and then use different ways and means to start your planning, and it is primarily gut-based planning. What we are trying to do in this scenario is: predictive analytics can give you more accurate forecasting, and then you start your planning scenario with
the results of the predictive, so you get a better planning baseline; that is the goal of it. We have all the technical systems for this. I want to quickly frame the scenario: imagine a company having hundreds of products; it is very difficult for anybody to manage the models for each product line or each product, and to integrate that with any planning solution. What we plan to use here is the ability to do segmented forecasting in Predictive Factory. What it does is: you define a forecast on a single product, and then you segment it based on a given key or a given set of keys, so that starting from one product you can go to many, segmenting into different levels and different granularities depending on your models. The other feature of Predictive Factory is that you can schedule the forecasting and model training and everything in the background, so even if you have hundreds of models, you can schedule them in the background, and the results can be saved into HANA; these results are going to be the feed for BPC. What I am trying to show here is the architecture. It is a simple architecture, and there are many ways to do this; this is one way we want to show you how the whole scenario works from the process side. What we are looking at is: we have historical data, and the historical data is in HANA, right, and then you predict using Predictive Factory, and the results are saved to a HANA table. What you can then do is create a wrapper, a HANA view, that exposes the predictive results, and you have the option to enrich the predicted results you got from Predictive
factory and then you got immortal in your VW the BW you can create their open or ES views but that can that can fetch the results from the Hana Hana view where the predicted results are and other I view how the advanced data store objects where you are real time a sitting in and then you can build a composite for wider and then you expose such as an editor after building a Grecian level you can expose the data set to the planning layer out so now look what happens here so what happens here is when the planner comes to the panel comes and opens his layout what she is seeing here is pretty clear results combined with the real-time data and then he can start his planning and this is he can start his planning that will be the thing that's where you are starting your planning with a more accurate but this is a this is a one of the option and and the second thing is second thing is we have a little bit of chocolate interactive thing here interactive thing here so what happens here is your bit as you are aware that we DPC is the driver based planning and there are some times you want to have a scenarios where you you want to you want to do some simulation if you want to give it an extra focus for the particular product or product lines but so in this in this screen like you know this is this back this scenario is with the customer has a huge new products and he don't want to do all products interactive 2d but what she does is use good number of products are the majority of the products with a predictive factory so that if use for the massive adoption massive like math you an option for all forecasting results and whereas the selected products we want to do do some simulation so what you can do here is what the user wants to simulate simulate the simulate the predictions if a certain drivers okay so so you imagine that here that is your V PC is modeled based on certain scenes certain drivers and then those drivers like key inputs into the into the earth analytic 
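To make the segmented-forecasting step described above concrete, here is a minimal Python sketch of the mechanism: one forecast definition, applied automatically per segment (here, per product). This is purely illustrative; the function names, the simple linear-trend model, and the sample data are all invented stand-ins for what Predictive Factory actually does when it trains a time-series model per segment and writes the results to HANA.

```python
# Sketch of segmented forecasting: one forecast definition, applied
# per segment (per product). A stand-in for Predictive Factory's
# per-segment time-series models; all names and data are illustrative.
from collections import defaultdict

def linear_trend_forecast(history, horizon=1):
    """Fit y = a + b*t by least squares and extrapolate `horizon` steps."""
    n = len(history)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(history) / n
    denom = sum((t - t_mean) ** 2 for t in ts)
    b = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, history)) / denom
    a = y_mean - b * t_mean
    return [a + b * (n - 1 + h) for h in range(1, horizon + 1)]

def segmented_forecast(rows, horizon=1):
    """rows: (product, period, qty) tuples; returns {product: [forecasts]}."""
    segments = defaultdict(list)
    for product, period, qty in sorted(rows):
        segments[product].append(qty)
    return {p: linear_trend_forecast(h, horizon) for p, h in segments.items()}

sales = [("A", 1, 10), ("A", 2, 12), ("A", 3, 14),
         ("B", 1, 100), ("B", 2, 90), ("B", 3, 80)]
forecasts = segmented_forecast(sales, horizon=2)
```

The point is the shape of the process, not the model: the forecast logic is written once, and the segmentation key (here the product) fans it out to every segment, which is what makes hundreds of products manageable.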
For example, if one of the drivers of your BPC model is the price of oil, you can have the price of oil, or of any other commodity, as an input field in the analytic application. By changing the price in the analytic application you can simulate the predictive results by calling the APL (Automated Predictive Library) on HANA: you trigger a simulation, it goes to the APL on HANA, which reads the sales history, runs the predictions, and sends the results back to the analytic application. The user can run such a simulation once, twice, three times, over several iterations, and with multiple drivers as well. Once the user settles on a particular combination, a particular price level, he can save the results; when he does, the planning functions that are part of BPC embedded on HANA write the results back into BPC and the planning framework. At that point you are technically back in the first scenario, where the predictive results sit in a HANA table, and everything else works in exactly the same fashion.

So what are the key benefits? First, predictive analytics helps you identify the business drivers; that is the very first step, and a nice part of predictive analytics: once you build your model, it tells you which drivers matter, and those drivers are your starting point for building the BPC models. Second, you get more accurate forecasts for the planning process.
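The interactive simulation loop just described can be sketched as follows. The linear driver model, the numbers, and the function names are all invented; in the real scenario the round trip is an APL call on HANA that reads the sales history and returns a prediction.

```python
# Sketch of the driver-based simulation loop: the analytic application
# sends a driver value (here, an oil price) to a predictive routine and
# shows the returned forecast. The toy model below stands in for the
# APL call on HANA; all names and figures are illustrative.

def simulate_forecast(base_demand, price_sensitivity, oil_price):
    """Toy driver model: demand falls linearly as the oil price rises."""
    return max(0.0, base_demand - price_sensitivity * oil_price)

# The planner tries a few driver values (iterations one, two, three)
# before saving the chosen result back via a planning function.
scenarios = {price: simulate_forecast(1000.0, 5.0, price)
             for price in (60.0, 80.0, 100.0)}
```

Each entry in `scenarios` corresponds to one what-if run in the analytic application; only the combination the planner finally accepts is written back into BPC through the planning framework.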
That is because, with segmented forecasting in Predictive Factory, you can forecast all the products and product lines and feed more accurate results into planning. Third, mass enablement: again a feature of Predictive Factory, it can forecast hundreds of products, and the models can be managed automatically. And it is a seamless process when you are using BPC embedded; between Predictive Analytics and BPC embedded it is essentially seamless. If you are on standard BPC there are some workarounds: you run the forecasting from Predictive Factory, save the results to a database wherever you want to stage them, and then use the Data Manager functionality of BPC to upload the forecasted data into BPC. So standard BPC can use this too, while BPC embedded is more seamless and gives you a number of additional features that benefit the planning process.

Thank you very much. Two questions came in very recently. One is whether classic BPC is supported with Predictive Analytics, which was partly answered in the chat already, and the other is how you can execute the predictive procedure in the planning function. That's beyond me; can you take those?

Yeah, sure. For the planning function, the idea is this: with the APL you create an APL procedure that does the projection, and this APL procedure can be consumed in a planning function.
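The planning-function-to-APL handoff just described can be sketched like this. In BPC embedded this would be a SQLScript planning function calling an APL-generated procedure; everything here, including the names and the dummy forecast values, is invented purely to show the parameter flow.

```python
# Conceptual sketch of the planning-function-to-APL handoff: the
# planning function collects its parameters and forwards them to a
# forecast procedure, whose result rows are written back to planning.
# All names and values are illustrative stand-ins.

def apl_forecast_procedure(product, horizon):
    """Stand-in for the APL procedure: returns (product, period, qty) rows."""
    return [(product, h, 100.0 + 10.0 * h) for h in range(1, horizon + 1)]

def planning_function(params):
    """Forward the planning-function parameters to the 'APL' procedure."""
    return apl_forecast_procedure(params["product"], params["horizon"])

rows = planning_function({"product": "A", "horizon": 3})
```

The design point is that the planning function is only a thin, parameterized wrapper: the predictive logic lives in the procedure, so the same model can be invoked with different filters and horizons from the planning layer.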
Through the planning-function parameters you can pass values from the planning function to the APL; the APL executes the predictive model and returns the results. That is how it works with the planning function.

On the question about standard BPC versus embedded: embedded is more native in nature, easier to monitor and easier to run cleanly from the process side. With standard BPC you have to do some manual, somewhat disconnected steps, but you can definitely do it, because the planning process is not something you run on a day-to-day basis: you run the predictive forecasting, and then manually, or on a schedule, use BPC's Data Manager function to upload the results into BPC. That is one way to do it.

Then there is a very interesting question from Brian further up in the chat: the marketing architecture shows Predictive Analytics on top of S/4 and BW/4HANA, but it seems it actually just sits on top of HANA and is not really integrated with either product. For S/4 we are building a product called Predictive Analytics Integrator, essentially an interim layer between S/4 and HANA, which per current planning we will be releasing to customers and partners in the next half year. For BW you are completely right; we are currently in contact with the BW development colleagues to make the interaction work against actual BW entities rather than HANA entities, and we are looking for customer cases to learn what is most urgently required there, to go beyond the scenarios we presented today, where, as you rightly say, we integrate on the HANA layer only, and to make it more BW-aware.
For example: make things callable from transformations, let BW tap into the Predictive Factory model repository, publish models right into BW; all of these are candidates. To assess them best I would be extremely interested to learn your requirements, and possibly even to schedule calls between BW development, PA development and customers to discuss concrete ideas on how you envision such scenarios. So I heartily invite your feedback. Thank you, thank you.
