
SAP Gateway speaks JSON and PhoneGap talks device, or "How I learned about a match made in heaven"


I am an avid user of mobile devices and I love them at home and around work, but never really had any interest in learning how to develop native applications for them, as it would mean that I had to learn yet another language that I would most certainly forget how to use right away due to lack of practice.

Having said that, I nevertheless always felt that I was missing the train in some ways, seeing that mobile devices are everywhere and have become a huge factor for companies, not only in reaching individual customers but also in providing better and more efficient services to their employees and business partners.

 

This feeling got me interested in mobile frameworks like Sencha Touch, Appcelerator or jQuery Mobile that allow you to build web pages for mobile apps. I was primarily attracted by the fact that such applications would be written in HTML and JavaScript, which I already knew, and on top of that would be device independent, or I should say that is what their marketing promises. I quickly found out that this approach, while quite OK for some use cases, does not work that well for others. These frameworks are designed to be put onto a web server and are typically served to the device through an existing internet connection. Some of these frameworks provide packaging capabilities that allow you to deploy the application on the device, but as of this writing, this packaging does not give you access to OS functionality like the accelerometer, compass, camera, the event system of the device, the file system, geolocation and many more features.

 

Somewhat disappointed, I walked away from them, although I was impressed by their capabilities and really liked their approach. I really did not look back until recently, when I got introduced to PhoneGap. PhoneGap, much like NimbleKit, is a device-dependent library that consists of a WebView capable of rendering local HTML and JavaScript files to build the UI, plus JavaScript APIs on the OS side that allow you to talk to phone capabilities like the ones mentioned above. Naturally I had to try this out, knowing that if it worked, I could just put one of the web frameworks on top of it and finally have a set of tools that would allow me to build applications not only for one device, but potentially for all devices that are supported by PhoneGap or NimbleKit, with no native development knowledge.

 

Container libraries like PhoneGap with a web framework on top are already a quite attractive combination for a developer, but the idea of being able to read data from an SAP source by using Gateway on top of that is killer. Up to SP3, SAP Gateway required you not only to call a Gateway service but also to parse the Atom feed that Gateway would return to your application. Not anymore. Since SP4, SAP Gateway talks JSON, and thanks to the good folks of ES Workplace, their Gateway is already on SP4. This combo got me really excited. I was thinking "Gateway, JSON, PhoneGap, JavaScript … this should work like a charm". So let's start by trying to make PhoneGap talk to SAP Gateway.

 

Following is the set of pieces and a brief description of how to make this work. I have so far not added a web framework, but I will for sure in the future. Right now, it is only PhoneGap and Gateway. I only had to write a few lines of HTML and JavaScript to make this work, which should give you an idea how simple yet powerful this combo is.

 

I made my Android based PhoneGap project available for you to pull from GitHub in case you are interested in it. I assume migrating this to, let’s say iOS, should be a snap.

 

First things first. You have to start by building a project for the device you will eventually run your app on. PhoneGap has various "Getting Started" tutorials showing you how to do this. If you intend to make HTTP calls from PhoneGap, your natural next step is to find out how to make this work. PhoneGap is essentially a WebView, so you make HTTP calls with the almighty XMLHttpRequest directly from JavaScript. People who know XMLHttpRequest will immediately think that they have to work around the security restrictions of JavaScript that prevent cross-domain HTTP calls. This is not the case with PhoneGap. PhoneGap provides configuration entries inside its cordova.xml file that allow you to specify the domain(s) that your application is allowed to call.

 

<access origin="https://gw.esworkplace.sap.com/" />

 

A quick entry into the cordova.xml file for ES Workplace and we are able to make calls to it.

 

Next on the list is looking at XMLHttpRequest itself. The documentation that I used showed that the open method has five parameters: type of request, URL, synchronous or asynchronous call, username and password. Filling them with the right values should not be that complicated to read something off of ES Workplace. I figured that the first parameter certainly has to be "GET" and the second one a valid URL from ES Workplace. Looking around, I found the list of available Gateway services on ES Workplace. To get a service to respond with JSON, you simply exchange the $format=xml in the URL with $format=json. Going through the services on ES Workplace, I found that not all services do this and I checked back with the ES Workplace folks. The reason for this is that some services on the ES Workplace were built with an older framework that has since been replaced with Gateway. If you find such a service, do not try to make it work with JSON. Simply move on to another one that does work. I found that the ones around the flight example work quite well and I ended up using the list of agencies.

 

http://gw.esworkplace.sap.com/sap/opu/odata/IWBEP/RMTSAMPLEFLIGHT_2/TravelAgencies/?$format=json
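
In a plain browser page, the call that this documentation describes would look roughly like the sketch below (the credentials are placeholders). As explained in the next paragraph, PhoneGap ignores the username and password arguments, so the authentication has to be handled differently there.

// Sketch of the documented five-parameter open() call - credentials are placeholders.
// As described below, PhoneGap ignores the last two arguments, so on PhoneGap the
// Authorization header has to be built by hand instead.
var req = new XMLHttpRequest();
req.open(
  "GET",
  "http://gw.esworkplace.sap.com/sap/opu/odata/IWBEP/RMTSAMPLEFLIGHT_2/TravelAgencies/?$format=json",
  true,        // asynchronous
  "username",  // placeholder
  "password"   // placeholder
);
req.send();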

 

Although ES Workplace requires a username and password, I thought that this would not be a problem at all. Simply call the open method with GET, the URL, true (for an asynchronous call), username and password and be done. Not so on PhoneGap. Searching the web, I found out that PhoneGap does not support the username and password parameters, so I had to put the credentials manually into the request header. This required a way to encode the username and password into a Base64 string, which at first I did not know how to do. Eventually I stumbled over a number of tutorials that said I should use the Web Toolkit Base64 JavaScript library for it. I downloaded the code from the website, put it into a base64.js file inside my project and made the necessary references to it. I extracted some of my project code here, but please have a look at the project for the full example and how it works.

 

// Build the HTTP Basic Authentication header value from username and password
function makeBasicAuth(uname, pword) {
  var tok = uname + ":" + pword;
  var hash = Base64.encode(tok);   // Base64 comes from the base64.js library
  return "Basic " + hash;
}

// Call the Gateway service with XMLHttpRequest, passing the authentication header
function getAgencies(uname, pword) {
  ...
  var auth = makeBasicAuth(uname, pword);
  var req = new XMLHttpRequest();
  req.open("GET", "http://gw.esworkplace.sap.com/sap/opu/odata/IWBEP/RMTSAMPLEFLIGHT_2/TravelAgencies/?$format=json", true);
  req.setRequestHeader('Authorization', auth);
  // ... the onreadystatechange function definition goes here (see project on GitHub for details) ...
  // Finally send the XMLHttpRequest
  req.send();
}

 

The only thing that is left is making sense out of the response. That in turn is very simple as it only requires you to make use of the JavaScript JSON.parse functionality.

 

var obj = JSON.parse(req.responseText);

 

JSON.parse returns a JavaScript object and you are left with looping over it, picking out the information you would like to display. For the agency service, the array of agencies is nested inside obj.d as the results array, and you can do a for loop over them, sticking them into the DOM tree via the innerHTML property of an HTML element.

 

for (var i = 0; i < obj.d.results.length; i++) {
  document.getElementById('lbAgencies').innerHTML += obj.d.results[i].Name + "<br/>";
}
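
For orientation, the parsed object has roughly the shape below. This is only a sketch with made-up values; apart from Name, which is used above, the property names depend on the service.

// Sketch of the parsed OData JSON response - values are made up for illustration.
var obj = {
  d: {
    results: [
      { Name: "Agency One" /* , further properties of the entity ... */ },
      { Name: "Agency Two" /* , ... */ }
    ]
  }
};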

 

 

If you are interested in what else is inside the response, copy the response that you get from the Gateway into one of the online JSON beautifier apps.

This is basically it for reading data from SAP Gateway into the PhoneGap container. Certainly this is only the beginning of something that I think is very promising. In the coming weeks I will go ahead and experiment some more with this combo and update you on my results.

 

At last here is a quick summary of the links provided throughout the document:

 

 

 


The University of Mississippi: Debugging and Refining Our Gateway Services


As Ole Miss continues to expand its usage of SAP’s NetWeaver Gateway, we have learned a few useful tips and tricks towards debugging and refining our OData feeds. I would like to take this opportunity to share these, some of which seem blatantly obvious now.

 

Background

At the University of Mississippi we utilize the speed and portability of SAP’s NetWeaver Gateway on multiple fronts. Some of our more traffic-heavy “game changers” are the following:

 

  • Official Ole Miss App ~ iPad Edition – An informational app which integrates with our backend SAP system to provide an interactive campus map, sports scores, calendar events, and directories which include people, departments, organizations, committees, policies, and more.
  • Attendance Tracking System – A pilot program integrated with the SLcM Attendance Tracking package allowing the use of touch screen barcode scanners placed in classrooms.
  • Admissions Counselor iPad App – An app used by traveling admissions counselors to view and update student records as they visit high schools for recruitment.
  • A variety of web-based solutions – Services that expose our backend SAP data on people, courses, departments, committees, and more to our public-facing website.

 

Common Mistakes

Selecting a valid “key” value to distinguish individual entries is one of the most common challenges we have faced with our services. You have the ability to set keys to whatever you like; however, there is not an immediate check to ensure that what you have chosen is, in fact, a well-formed key.

 

On many of our projects, a developer creates a Gateway service for a long-standing SAP Remote Function Call (RFC) whose original developer is no longer in the picture. The new developer wraps and deploys the service and passes it to the consumer for use.

 

Non-Unique Keys

For many of our web-based implementations, PHP was used to consume the OData feed, and this is where errant behavior was noticed. The PHP OData library, when given a query feed without a truly unique key, will process the feed without a warning or error. For entries with matching keys, only the first entry's non-key fields survive; the rest are silently dropped due to the way the PHP OData library consumes the feed.

 

NULL Value Keys

As is common with some older SAP implementations, longer text fields (i.e., notes) are split into finite character length fields and placed in a table return. In an effort to return information about our standing committees, we generated and began consuming OData feeds to retrieve descriptions and memberships for display on our public-facing website. Of the 200+ committees, queries for 99% worked as expected. However, in some cases a Gateway error was occurring, visible even when viewing the XML feed directly in a web browser.

 

 

Error Message:
/IWFND/COD/011 - The SAP system could not process your request. Contact your administrator

 

 

 

 

 

 

/IWFND/ERROR_LOG is a very useful transaction for identifying any further details about the origin or conditions of an error. I recommend checking this first to try and discern the nature of the error.

 

error_context.png

 

Viewing the Error Context in the log, we were able to identify the called URI as a valid request. Our next step was to circumvent Gateway and run the RFC with the provided criteria directly. At first glance, the RFC appeared to be sending a valid response.

 

As part of the SAP NetWeaver Gateway system, classes are generated given your RFC and mapping decisions. At this point in debugging, it can be very useful to break into these, set breakpoints to trace the exact point of failure, and  identify the culprit.

 

From the Object Navigator (se80), select “Class / Interface,” and search for your data model with a wildcard (*) on the end. You will see a list of related classes. For this case we selected the entry that ended in “_Q_D.” The “Q” represents Query, and “D” is the subclass that contains the methods we need to access. Navigate to “Methods,” “Redefinitions,” and open “/IWFND/IF_BEC_BOP~EXECUTE.”

 

se80.png

 

The line “call function mv_rfc_name destination mv_destination” marks the start of the code where the outside SAP RFC is called and the return is mapped. Place a breakpoint somewhere prior, and step through watching, in particular, the value of “sy-subrc” and “ls_response.”

 

unmod_code.png

 

Within "ls_response," we were able to identify that some returns included a table row with a NULL value. As this field was mapped in Gateway as a key, the mapping obviously fails for those values and ultimately produces a runtime error.

 

For a quick, easy fix, we edited "/IWFND/IF_BEC_BOP~EXECUTE" following the function call, adding a check for a NULL line and removing it if found. A more permanent fix would be to wrap the RFC in SAP and return a string type to avoid dealing with a table return.

 

Screen Shot 2012-08-29 at 2.03.34 PM.png

 

Conclusion

In dealing with any service issues, my first advice is to check the S_COR_ID-VALUE mappings and be sure they are non-NULL and unique for all cases. When in doubt, check the logs and use the debugger.

 

 


PowerBuilder and Gateway - The Sinatra style


The other day I was playing with the idea of consuming Gateway from PowerBuilder...so of course, I tried a few things like reading it as a Web Service or a REST Service...none worked, as Gateway generates OData.

 

PB_GW_01.png

 

A couple of days ago I read this awesome blog by Mark Bradley called Gateway over PowerBuilder where he was using an OData Service DataWindow...which I couldn't find in my PowerBuilder IDE...I contacted Mark and he told me he was using a "not released yet" version of PowerBuilder, so my new goal was to find a new way to connect PowerBuilder and Gateway using what I currently had...

 

I tried a lot more things...including the WCF Data Services for OData, which didn't work at all...

 

But since part of my job is breaking my head trying to achieve the craziest and coolest ways of doing things...I decided to take another approach...

 

I remembered that ruby_odata is capable of consuming Gateway, as I was one of the people who contributed to that project: Ruby, Camping and...Gateway? (Sorry...will fix the code as soon as I can...old post)

 

Then I learned that Sinatra, the classy Ruby web framework, was capable of exposing data as a REST service (you also need to install the JSON gem)...so all the pieces were falling together...

 

I wrote a small Ruby/Sinatra script to read from Gateway (for this example, I made the service anonymous...just to avoid having to type credentials too often)...BTW...I called it Sinatra_JSON.rb

 

PB_GW_02.png

After launching it, I could check it on my browser...

 

PB_GW_03.png

 

With that ready...I could move to PowerBuilder...create a Solution --> Target (Specify that you want a window to start) --> And then a RESTFUL Client.

 

PB_GW_04.png

PB_GW_05.png

PB_GW_06.png

PB_GW_07.png

After that, we need to Generate the Proxy...

 

PB_GW_08.png

 

And create a Grid DataWindow...

 

PB_GW_09.png

 

We need to define the columns that we're going to retrieve and show in our window.

 

PB_GW_10.png

 

We create a DataWindow inside our Window (w_window) and a button. We're going to drag & drop the Grid DataWindow into our DataWindow.

 

PB_GW_11.png

 

Double click the button and paste the following code...

 

PB_GW_12.png

 

Go to the application and double click on it, paste this code...

 

PB_GW_13.png

 

We're ready to run our program, and press the "Call Flights" button...

 

PB_GW_14.png

 

It works! So as a little wrap-up...we read the Gateway service using ruby_odata and expose it as a REST service using Sinatra. From PowerBuilder we create a REST client, consume the Sinatra REST service and render it on a Grid DataWindow...

 

Hope you like it...so far...I guess this is the best way to make PowerBuilder and Gateway work until we can put our hands on the latest PowerBuilder release.

The only drawback of course, is that I need to pass the filters manually instead of passing them dynamically...well...next time maybe...

Netweaver Gateway & OData: URI conventions in practice


Over the last year I’ve had various exposure to Netweaver Gateway; the usual “so what is this all about?” tinkering and self-teaching, actual project use and even training of junior consultants.

 

Prior to this I had been working on some R&D with a custom REST dispatch layer, i.e. doing RESTful SAP access without Gateway. The main bone of contention in the abstract design was the URI scheme – just what was the best way to structure resource paths and pass key values, parameters, etc. to the backend? In this respect REST is open to interpretation since it is not a protocol.

 

When it was announced that Gateway would use OData we were still some way off of getting a useable product, but it got me thinking: if I were using the OData protocol, how would I incorporate it into my dispatcher? It was pretty clear that an OData parser would be required to turn the URI scheme into something that the average non-webhead ABAP guy could understand.

 

With the arrival of Gateway Builder, it is now much less of an effort to put together the implementation layer of the data provider, and the model provider classes can pretty much be left alone. Now we have more time to delve into the OData interface and discover exactly what OData can do with 'standard' method results, and which OData scenarios require some extra understanding and ABAP effort.

 

To this end, I’ve been working through the OData URI conventions and comparing these to the Gateway OData implementation.

 

Basic entity addressing

OK let’s clear something up first. Always keep the cardinality 0:n in mind. Even if you know that an entity has a 1:1 cardinality, the OData protocol is taking  0:n as a baseline.  This makes sense as 0:n has to encapsulate all the other cardinalities.

The practicality of this means I should be referring to all entities in plural terms.

 

resource_path/service/foos

 

could return no foos, one foo or a whole basketful. That’s the point, it’s meant to work that way. Suppose I get several foos and want to reference one  (foo9) and navigate into it?

 

Well, I’ve actually got a navigation link to it in the result feed, so I don’t need to worry about that. Assuming I didn’t have that link, the logical addressing would be:

 

resource_path/service/foo(‘foo9’)

 

I’m referencing a single resource so I don’t need to bother with the entity set syntax now, do I...

 

WRONG!

The above would be thrown back by the OData parser telling me that the resource ‘foo’ does not exist.

 

I have to sideline English and logic here and return to my 0:n way of thinking. The correct syntax is:

 

resource_path/service/foos(‘foo9’)

 

Dammit – so I’ve got to go through the get_entityset method again? No, I don’t.

 

Although I am using the plural syntax, the simple addition of the key specifier section – adding  (‘foo9’) after the resource name - tells the parser that it needs to send this request to the get_entity method in my data provider.

 

 

Intrinsic System Query Options

 

The OData standard specifies a set of system query options that you can apply to any resource request; that is, the result will often be different from the vanilla request on the same node.

 

Gateway implements some of these system query options intrinsically; provided that you have working GET methods for the entity and entityset you can expect the OData parser to transform the result for you.

 

Here are the intrinsic options.

 

Property extraction - single

 

Assume that the ‘foo’ entity has these simplified properties:

 

<EntityType Name="foo" sap:content-version="1">

<Key>

<PropertyRef Name="id" />

</Key>

<Property Name="id" Type="Edm.String" MaxLength="10" sap:label="Foo ID" />

<Property Name="name" Type="Edm.String" MaxLength="30" sap:label="Name" />

<Property Name="category" Type="Edm.String" MaxLength="10" sap:label="Category" />

<Property Name="price" Type="Edm.Decimal" Precision="9" Scale="2" sap:label="Price" />

<Property Name="currency" Type="Edm.String" MaxLength="5" sap:label="Currency" />

</EntityType>

 

It is possible to extract just one of these properties as a result. The option can only be applied to a single entity URI.

 

resource_path/service/foos(‘foo9’)/category               //obtains the category of an entity identified as ‘foo9’

 

N.B. there are further formats of URI that can access single values at different levels of navigation – here I am just presenting the basic form against a simple primary entity address. I hope to revisit these later.

 

 

Property extraction – SETS ($select)

 

Sets of properties can be extracted; these are obtained with the $select query option. The option can be applied to any URI addressing an entity or a set.

resource_path/service/foos?$select=name,price,currency

resource_path/service/foos(‘foo9’)?$select=name,price,currency

// property section only contains request properties

<m:properties>
  <d:name>foo-nine</d:name>
  <d:price>380.00</d:price>
  <d:currency>gbp</d:currency>
</m:properties>

 

Instance count - ($count)

 

The $count option is similar to the Open SQL count in that it will return a count of the query results and no data (note that in the URI, $count sits behind a final ‘/’, not a ‘?’ operator like other options do!)

 

 

resource_path/service/foos/$count 

 

The $count option only makes sense when applied to an entity set address, but it will still work (sort of) on the access to a single entity, since it operates on the assumption of an overall 0:n cardinality.  I say ‘sort of’ because if the entity does not exist for the requested key, the count is still 1. That is semantically incorrect, since a client should be able to run an existence check on a key and determine true or false from the count result.

 

Why do we get a count of 1 when the key is wrong? It is technically correct because the response feed contains an empty property set after it comes back from the GET method. There doesn’t seem to be a way to make it return zero; if you modify the DPC class method to not transfer the data, you get an inner error reporting the resource is not found. That is “correct” but I’d rather not have inner errors used to report non-existence since they are logged as server errors.
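
Seen from a JavaScript client, such an existence check would be something like the sketch below (the service path and key handling are illustrative), which is exactly why a count of 1 for a wrong key is misleading.

// Existence check via $count - a sketch; the service path and key are illustrative.
function fooExists(key, callback) {
  var req = new XMLHttpRequest();
  // note: $count sits behind a final '/', not behind a '?'
  req.open("GET", "resource_path/service/foos('" + key + "')/$count", true);
  req.onreadystatechange = function () {
    if (req.readyState === 4 && req.status === 200) {
      // With the behaviour described above this is 1 even for a non-existent key,
      // so the client cannot tell "exists" from "does not exist".
      callback(parseInt(req.responseText, 10) > 0);
    }
  };
  req.send();
}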

 

One point to note on the above regarding performance: the property extraction/counting is done after the method call, applied to the results. Your method has no inkling that a reduced set of properties is being requested, therefore it will always obtain the full property set. Until SAP changes the design to transfer this level of request information to the backend, the ABAP layer will not be able to provide faster access to the data. This is technically very similar to WDA contexts that are based on Dictionary structures – the whole set of columns is present in the background and the meta-model maps to those attributes used in the context, but in DB terms it's always a full-width access.

 

There are some further intrinsic query options; $expand and $links -  however they only have any value once you start navigating between entities. I’ll cover these in the next installment.

 

 

System Query Options requiring ABAP extensions

 

Now we come to the system query options that won’t work unless you provide some ABAP that anticipates and executes them.

 

Entity Set Paging ($top and $skip)

 

These options are at once the simplest to understand and possibly the hardest to implement within an enterprise service.

 

$top limits the number of results from a set. This is a GOOD THING and should be mandatory in my opinion. Having an open query could be disastrous where the potential datasource is massive.

 

resource_path/service/foos?$top=200

 

Developing an entityset feed should always involve some kind of $top consideration. The value for the $top option is fed into the ABAP method via IS_PAGING-TOP.

 

You could design the ABAP logic to assign a default value for $top if the requester doesn't set it, rather than having an unlimited retrieval, but I don't believe that's a good idea. How would the recipient know we had limited the results? In short, the client should be required to send a $top value, and the acceptable maximum $top value should have a limit. How this is implemented and enforced is up to you guys.

 

Thorny issues don’t stop with $top. So I got the top x entities, I want to get the next bunch.  In order to get a new collection, I can ask the server to skip n entries using the $skip option.

 

resource_path/service/foos?$skip=200

 

However that takes us back to the ‘how many?’ question posed by $top. There’s no point limiting the first query to 1,000 entries then letting the client skip past those and get a million.

No, $skip by itself is pretty useless, dangerous and other nasty things. Therefore another of Ron’s Rules is that any $skip still requires a $top value to be sent with it. Yes, you can do this – this is the first mention of the fact that you can combine some of these options. It also matches the paging concept as you expect a discrete number of rows per page.

 

resource_path/service/foos?$skip=200&$top=200


By the way, the value for the $skip option is fed into the ABAP method via IS_PAGING-SKIP – in case you hadn’t figured it out when you checked out its cousin ’TOP’.
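
On the consuming side, a simple helper that always sends both options together, following the rule above, might look like this sketch (the page size and service path are illustrative):

// Build a paged collection URL - a sketch; page size and service path are illustrative.
function pageUrl(page, pageSize) {
  // page 0 -> $skip=0&$top=200, page 1 -> $skip=200&$top=200, and so on
  return "resource_path/service/foos" +
         "?$skip=" + (page * pageSize) +
         "&$top="  + pageSize;
}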

 

As if that wasn’t bad enough, things are now complicated by the architecture. The requests are – by design – stateless as far as the server is concerned. A $top option in a request is not too bad and easily ABAPed.

 

$skip on the other hand – ewww! It is stateless, which means that the only way you can relocate your “logical cursor” to satisfy the request is to go back to entity 1 and read your way forward $skip entries. Other than adding a stateful component on the server side that is linked to the GET method I cannot see how this mechanism for traversing through paged data can be good for performance.

 

On top of this, the ordering of any logical pages has to be consistent so that you get the same instances in the same location in each page. That’s one for individual implementers to decide on…

 

…which brings us to $orderby. This allows us (via IT_ORDER in the method interface) to specify a sort order. Since Open SQL has an 'order by' clause you will probably be tempted to use that; however, I'd be really worried that I'd be missing something in these disconnected queries. I never use that clause anyway, as I was warned it would place too much burden on the DB server.
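
For completeness, an $orderby request follows the same query option pattern as the others; an illustrative example, sorting by price and keeping a $top limit, would be:

resource_path/service/foos?$orderby=price desc&$top=200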

 

Since there are other features of OData that can satisfy queries with better performing results, I would limit the use of $top and $skip to “document model”  type resources that would typically have a few dozen ‘pages’ but could still have a few thousand and work well. 

 

Well I think I’ve gone on long enough about those three – mainly to get you thinking about the pitfalls those options can open up. There seems to be a pattern there – the URI scheme OData suggests that the backend requirements for something like $top are quite lightweight and therefore the implementation should be light. Quite often it’s the case that the simplest options invoke the heavier load on development and resources.

 

In my next blog I intend to look at progressing to more complex and advanced topics such as navigations and better query options.  

OData Everywhere


We're well into Day 1 at SAP TechEd 2012 in Madrid, and while SAP NetWeaver Gateway has already been mentioned in this morning's keynote (even though the keynote was more Sapphire-focused than TechEd-focused), and is noted as an enabler in various conversations public and private, there's a particular part of Gateway that is shining through as today's story for me: OData.

 

Just now, I attended the SAPUI5 Q&A session with Tim Back and Oliver Graeff, where they presented a great overview of the libraries, tools and features of what is becoming an ever more popular platform for outside-in UI development. After all, it's almost policy at SAP to use SAPUI5 for development projects, where appropriate. ("Where appropriate" means in many circumstances except probably heavy power user application UI paradigms).  One of the key features of SAPUI5, and in particular the DataTable controls, is the ridiculously easy consumption of data. In particular, data made available by Gateway, in the form of OData. Sure, as I've noted before, SAPUI5 can consume arbitrary XML and JSON too, but the data exposed in the related, resource-oriented fashion by Gateway, OData in other words, is where the magic happens.
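
To give an idea of just how little glue code that consumption takes, here is a minimal sketch using an ODataModel with the sap.ui.table library; the service URL, entity set and property name are placeholders, not a real service.

// Minimal SAPUI5 consumption sketch - service URL, entity set and column are placeholders.
// Assumes the page contains a DOM element with id "content".
var oModel = new sap.ui.model.odata.ODataModel("/sap/opu/odata/sap/SOME_SERVICE/", false);

var oTable = new sap.ui.table.Table({ visibleRowCount: 10 });
oTable.addColumn(new sap.ui.table.Column({
  label: new sap.ui.commons.Label({ text: "Name" }),
  template: new sap.ui.commons.TextView().bindProperty("text", "Name")
}));

oTable.setModel(oModel);
oTable.bindRows("/SomeEntitySet");   // bind the table rows to an entity set exposed by Gateway
oTable.placeAt("content");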

 

Start with controlled definition of resources, and the relationship between them, done in your systems of record using the IW_BEP backend Gateway component building Model and Data providers, either manually or using the Service Builder. Then expose those resources and relations to your UI developers using the core Gateway components (GW_CORE and IW_FND). Then, you're off. Within no time you can start to see an application form around that data, with the right layer performing the right function with minimum friction. And that speed comes from the investment SAP has made in OData, an investment to make it all pervasive and all consumable.

 

So we know about Gateway being a key mechanism to expose OData for ABAP stack systems. Is there anything else? You bet. SAP HANA, full of data, can expose that data in an OData context. Use the magic of xsodata, create a definition marking a HANA table or view in a schema as an Entity,  and *boom* you have a consumable OData service. And it doesn't stop there. There are facilities in the NetWeaver Cloud to produce OData too.

 

What does all this mean? Well, to me it means two things. The first thing is that it means that Gateway has already been a great success. It Just Works(tm). I recently completed a customer project which went live earlier this year, and Gateway was a key component in the integration architecture. And after setting Gateway up and defining our entities and the relations between them, we moved up a layer in the stack and never really had to work hard on Gateway at all. It did exactly what it said on the tin. We started to use, and reuse, entities that we'd defined, in building out the features in the consuming application.

 

The second thing is how important your investment in Gateway is. Embrace Gateway and by definition you're embracing OData. Before you know it you and your fellow developers are conversant in Entities, Entity Sets, Associations and Navigations (the relationships) -- the building blocks of information in OData. And while this is a super end in itself, you're also setting yourself up to move out into the cloud, and across onto HANA. Have a look at the speed with which you can put together an app that consumes data supplied to it from Gateway. And then consider you're investing in that speed, and that speed across platforms.

Enhanced OData Parsing


I recently posted a blog regarding the ABAP implementation of various aspects of OData URI schemes. While preparing for part two of that series I came across some aspects of Gateway OData handling that seem to be lacking, so before I continue on to more in that series I thought it would be good to share how I’ve dealt with this (code junkies will need to page down a bit for that).

For the record, I’m discussing the state of play with Gateway 2.0, SP5. 

What’s bugging me?

 

Currently, the service document and metadata publish the basis for a contract that the OData client should be agreeing to. These contain SAP annotations that further stipulate the access conditions. These annotations are in the SAP namespace. I also cannot find any public documentation of what the annotations mean – if it’s there it’s in some obscure location. It certainly isn’t available at http://www.sap.com/Protocols/SAPData at the time of writing. 

 

It would therefore seem that it is up to us (Gateway service implementers) to explain the rules that annotations are setting out.  I’m not going to get into all of them, but the ones of interest here are:

  • Addressing conditions
  • Filtering conditions

 

Addressability

There is an option in the Gateway model to restrict direct addressing of an entity set. This is a good idea if the entityset can be large and should really be accessed in the context of a preceding entity. For example, sales order items are not normally accessed without knowing the order header that they belong to. If we granted unlimited access to an entityset called ‘salesitems’ it would probably run for a week.

 

The ‘addressable’ annotation is a Boolean that decides the mode of access.

 

image001.png

image002.png

 

Here is where the annotation states for entity sets are set in Gateway Builder. Also note the ‘Req. filter’ annotation – more on that below.

If we turn on the addressability option, we can restrict the access – cool! Let's make 'foos' addressable, regenerate, and have a look at the service document…

 

image003.png

What the …?

 

Why was the addressable setting turned off for all my newly modeled entities? That should mean they can't be reached in the first place, but I know from testing that they worked fine. Furthermore, when I made addressing active for foos, its annotation disappeared from the service document.

 

Purely from an OData client perspective, this is interpreted as:

  • Entity sets, by default, are not addressable. This is obviously not right!

  • If there is no "addressable" annotation, assume that its value is true. This only makes sense if the above point operated correctly.

 

So the client has this fuzzy spec and may not take any notice of it.

 

‘bars’ is supposedly not addressable and should be reached by a navigation from a foo (foo is parent to bars). Guess what – ignoring the “contract”, I can get ‘bars’ and it dumps thousands of them back to me. There is no runtime check on this “rule”.

 

I’d prefer something that looks like a rule configuration option (some will argue that it’s not a rule) to actually have some enforcement. I suppose I could live with that except there’s another itch I need to scratch…

 

Filter annotations

‘foos’ are my root in the model and I want to be able to run some sort of query on them – in fact, I’m going to tell the client that they can only access foos by filter conditions that my model dictates. This is to protect my server in terms of performance. Luckily, there are annotations that I can use to tell the client all about this.

As mentioned above, there is a box for this too.

image004.png

 

I may also specify which foo entity properties can be filtered. Here I choose ‘Category’ and ‘Currency’ – so I can only have one or both of these properties in a ‘foos’ filter string.

image005.png

 

Service document now states:

image006.png

 

And metadata for the foo shows:

image007.png

 

Did you spot it? Yes, while the service document is willing to tell us what is not filterable, it’s somewhat circumspect in telling us what IS filterable! At least there’s a pattern emerging, even if it’s not official:

If it’s not false it’s true.

 

Fair enough and not particularly original – except if all my properties were filterable the client wouldn’t even have a notion that there was any kind of distinction.

‘foos’ is supposedly filter-only and should be filtered by category and/or currency. Guess what – ignoring the “contract”, I can get ‘foos’ without a filter and it dumps thousands of them back to me. I can use a filter on any property and it dumps thousands of them back to me.

 

There is no runtime check on either of these “rules”.

Trusted client?

 

Picking holes in all of the above may be pedantic, after all we are developing the client and know about this stuff. Oh I forgot, the idea was to open up SAP data to non-SAP clients wasn’t it! We can expect a knock on the door from Bill’s Witnesses.

Given that the annotations spec seems non-existent, are we really going to trust clients to abide by the contract? What we need are some of those pesky messages from SAP telling them that they’ve been naughty – because they will be!

 

 

Let’s get coding!

 

What are we coding? I’m going to show you how I solved the absence of the rule validation for addressability and filtering.

 

The following enhancements are shown for guidance and are copied at your own risk; since they are within a core component you should satisfy yourself that the enhancements work with all viable URI formats.

 

The primary checks on entity set rules are actually quite easy to implement and both can be placed in one enhancement.

We can intercept the request at the point where the Gateway creates it as a URI object and validates the form of the URI. The class for the URI object is /IWCOR/CL_DS_URI.

 

Various generic URI checks are made in the constructor of this object so it’s surprising that it wasn’t extended to look at some of the more particular entity settings.

 

One of the methods called by the constructor is handle_entity_set. This validates the URI in terms of what entity set, if any is being addressed. The validation is on the OData protocol compliance, i.e. is the URI in the right format and does it contain an entity, key, system query options, etc. However, at this point the entity set name has been extracted so it is just a matter of reading the set definition and comparing it to the URI content. After a bit of investigation I decided that it could be enhanced so I went ahead.

 

I opened an implicit enhancement at the pre-method point in handle_entity_set.

 

 

* Check for the need to filter entitysets according to model definition.

  data: zz_entity_set type ref to /iwcor/cl_ds_edm_named,
        zz_annotation type ref to /iwcor/if_ds_edm_annotations.

* when the key is blank, this is an access on an entityset, not a navigation through it.
  if iv_key_predicate is initial.
*   cast to annotation interface
    zz_entity_set ?= io_entity_set.
*   get the annotation object from the set
    zz_annotation ?= zz_entity_set->/iwcor/if_ds_edm_annotatable~get_annotations( ).
    if zz_annotation is bound.

*     check the filter requirement. If the definition requires a filter, there should be a
*     '$filter' key in the parameter list.
      if zz_annotation->get_annotation_attribute( iv_name      = 'requires-filter'
                                                  iv_namespace = /iwcor/if_ds_edm=>gc_namespace_sap ) = 'true'.
        read table mt_query_parameter
             with key name = '$filter' transporting no fields.
        if sy-subrc <> 0.
          raise exception type zcx_ds_uri_syntax_error
            exporting
              textid  = zcx_ds_uri_syntax_error=>entityset_requires_filter
              segment = zz_entity_set->/iwcor/if_ds_edm_named~get_name( ).
        endif.
      endif.

*     check addressability
*     assume that sets are addressable unless a 'false' is specifically declared.
      if zz_annotation->get_annotation_attribute( iv_name      = 'addressable'
                                                  iv_namespace = /iwcor/if_ds_edm=>gc_namespace_sap ) = 'false'.
        raise exception type zcx_ds_uri_syntax_error
          exporting
            textid  = zcx_ds_uri_syntax_error=>entityset_not_addressable
            segment = zz_entity_set->/iwcor/if_ds_edm_named~get_name( ).
      endif.
    endif.
  endif.

 

 

N.B. If the URI contains a key predicate, e.g. “(key=‘somekey’)”, the left-side cardinality is ‘from 1’ so any filtering considerations would apply to the right hand side, which is not dealt with at this point in the URI handling. The right hand side is most likely a navigation and that opens up the question of navigation set filters which would require an enhancement in the appropriate spot.
This enhancement only deals with immediately addressed entity sets.

 

I have also subclassed the GW exception CX_DS_URI_SYNTAX_ERROR so that I could a) add some more meaningful messages, and b) have the Gateway core trap the exception as if it were standard.

 

The addressable requirement check needs nothing further to be added.

 

The filter requirement can still be checked in more detail. If a filter is marked as required, I can then check that the properties in the filter are ones nominated as filterable.

 

The code for this goes into the “<entity>_get_entityset” method of the data provider extension;  no enhancement is required.
The following check makes sure that anything declared in the filter is a “filterable” property.  A business rule exception is raised if the filter input is incorrect.

 

data: lo_model type ref to /iwbep/cl_mgw_odata_model,
        lo_metadata_provider type ref to /iwbep/if_mgw_med_provider,
        lv_internal_service_name  type /iwbep/med_grp_technical_name,
        lv_internal_service_version type /iwbep/med_grp_version,
        lv_ent_id type  /iwbep/if_mgw_med_odata_types=>ty_e_med_entity_id,
        ls_entity type ref to /iwbep/if_mgw_med_odata_types=>ty_s_med_entity_type,
        lv_property_name type /iwbep/if_mgw_med_odata_types=>ty_e_med_entity_name,
        ls_property type ref to  /iwbep/if_mgw_med_odata_types=>ty_s_med_property.


 

  field-symbols: <filter> type /iwbep/s_mgw_select_option.

* If a filter is supplied, cross check the inbound properties in the filter against the model
* definition.

 

  if it_filter_select_options is not initial.
*---get service details from context
    mo_context->get_parameter(
      exporting
        iv_name  = /iwbep/if_mgw_context=>gc_param_isn
      importing
        ev_value = lv_internal_service_name ).

    mo_context->get_parameter(
      exporting
        iv_name  = /iwbep/if_mgw_context=>gc_param_isv
      importing
        ev_value = lv_internal_service_version ).

*   Get a model instance
    lo_metadata_provider = /iwbep/cl_mgw_med_provider=>get_med_provider( ).
    lo_model ?= lo_metadata_provider->get_service_metadata(
                   iv_internal_service_name    = lv_internal_service_name
                   iv_internal_service_version = lv_internal_service_version ).

*   Get the entity definition ID.
    ls_entity = lo_model->get_entity( iv_entity_name = 'foo'   ).

    loop at it_filter_select_options assigning <filter>.
      try.
          lv_property_name = <filter>-property.
          ls_property  = lo_model->get_entity_property( iv_entity_id     = ls_entity->entity_id
                                                        iv_property_name = lv_property_name ).
        catch /iwbep/cx_mgw_med_exception .
          continue.
      endtry.
*     Exception if the filter switch is "off".   (Apparently ' ' is true in this context!)
      if ls_property->filterable = abap_undefined.
        raise exception type /iwbep/cx_mgw_busi_exception
          exporting
            textid       = /iwbep/cx_mgw_busi_exception=>filter_not_supported
            filter_param = <filter>-property.
      endif.
    endloop.

  endif.

How NetWeaver Gateway can help to mobilize business processes


We are all used to using our mobile devices for daily activities like checking the weather forecast, navigating through cities, booking train tickets and a lot of other stuff. Mobile devices have become today's Swiss Army knives.

 

Therefore employees and managers expect that they can also use their mobile devices in an enterprise environment. But how can you connect your mobile device to data in your SAP ERP system? There is a need for a lightweight interface which is easy to understand for an app developer who doesn’t know anything about SAP and ABAP technology.

 

SAP’s answer for this is NetWeaver Gateway. With NetWeaver Gateway you can build RESTful web services which expose business data through an OData channel to the outside. Because the OData protocol is an open standard, no special SAP knowledge is needed to consume such a NetWeaver Gateway service.

 

We at bridgingIT thought about how mobile devices can make field technicians' daily work easier and developed the solution "PM Radar", which consists of an iPad app, a Windows 8 app and a NetWeaver Gateway service. It combines data from SAP PM with location-based information. "PM Radar" is just a template for how you can easily combine the context of a mobile device with business data. "PM Radar" shows functional locations on a map depending on the current location of the mobile device. This makes it easy for field technicians to navigate from one technical object to the other. Because we assume that not every company maintains geo information in its master data, "PM Radar" allows you to select a functional location and add its geo information to the master data in SAP PM.

 

Developing an SAP NetWeaver Gateway service – our experience

 

SAP NetWeaver Gateway provides the OData interface to the non-ABAP world. There are several possibilities for how you can implement the service. We used the OData Channel API to implement our service. NetWeaver Gateway also offers the options to generate services based on BOR objects, RFCs, or screen scraping, but the OData Channel API is the most flexible. At first we determined which SAP function modules could provide the data we need from our ERP and which function module we could use to update the data in the ERP. We chose the approach using RFC function modules because we didn't want to code in the ERP system itself. Our NetWeaver Gateway runs on a separate ABAP system. You can also install NetWeaver Gateway on top of your ERP system, but you should think about whether you want to connect your ERP system directly to the internet.

 

As a second step we defined the entity structure and fields we wanted to expose. Until now, non-SAP developers have always struggled with the data structures and field names SAP interfaces offered. In SAP NetWeaver Gateway you can define a mapping of the field names. Internally you use the SAP field names in the ABAP code as usual, but the consumer of the service doesn't see these names. To the outside you see only the names defined in the mapping. This makes it much easier for non-SAP developers because you can use longer field names such as "CompanyCode" instead of "BUKRS".
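
As a small illustration of what the consumer then sees (the JSON payload below is made up), only the mapped names ever appear on the wire:

// What the consumer sees - a sketch with made-up values; the ABAP name BUKRS never appears.
var responseText = '{ "d": { "CompanyCode": "1000" } }';
var entry = JSON.parse(responseText).d;
console.log(entry.CompanyCode);   // "1000"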

 


Next, we created our ABAP class implementation. Our class inherits from super class /IWBEP/CL_MGW_PUSH_ABS_DATA. In transaction SEGW we assigned our ABAP methods to the corresponding HTTP operation. Finally we redefined the methods we need in our ABAP class and implemented them. In our case we call the RFC function modules in our ERP system in these methods.

 

Of course, this is a very high level description of how you implement a NetWeaver Gateway service, but it’s no rocket science! If you are familiar with the concept and the tooling, it’s quite easy and fast to develop a service.

 

For all non-SAP developers it’s really easy to use the generated OData service. There are APIs for several programming languages available and because OData is an open standard there is in general no difference between a NetWeaver Gateway service and any other OData service.
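
As a small example of what that looks like from JavaScript (here using the open-source datajs library; the service path and property name are placeholders):

// Consuming a Gateway OData service with datajs - a sketch; service path and property are placeholders.
// Assumes datajs has been loaded, which provides the global OData object.
OData.read(
  "/sap/opu/odata/sap/SOME_SERVICE/SomeEntitySet?$format=json",
  function (data) {                      // success: data.results holds the entries of the feed
    data.results.forEach(function (item) {
      console.log(item.SomeProperty);
    });
  },
  function (err) {                       // error handler
    console.error(err.message);
  }
);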

 

As you can see, SAP NetWeaver Gateway is a lightweight interface. It's easy for any developer who is familiar with RESTful web services to consume these services. It can not only be used for mobile apps, it can also be used for integration scenarios with desktop clients. It makes life much easier for developers: the ABAP developer doesn't need to explain the data structure in detail anymore and the non-SAP developer doesn't need to deal with the ABAP field names. As SAP NetWeaver Gateway is also part of Duet Enterprise and SAP will add an OData channel to many products in the future, it's really worth getting familiar with this tool.

 

Unable to execute query operation


Hi Experts,

 

I have created an OData service via the Gateway Service Builder. I am able to execute the instance method (Read operation) but I get an exception when I try to get the EntitySet (Query operation).

 

<code>F2/718</code>

<message>No customer was found with these selection criteria</message>

 

I have mapped BAPI_CUSTOMER_GETDETAIL2 (for read) and BAPI_CUSTOMER_GETLIST (for query).

 

I also checked that the backend data is available and that the proper configurations are in place.

 

Please help me to resolve this.

 

Thanks

Kawish.


Creating a SAPUI5 starter application with the Gateway Eclipse plugin


Having just installed and tested the GWPA plugin for Eclipse, I found the process of producing a basic application very straightforward. Even so, I thought it would be worth producing some content to cover the subject as there is nothing in this space at the moment.

 

I am assuming that readers have Eclipse installed - I am using Juno Build 20130225-0426.

 

Eclipse plugin setup

 

If you do not have the GWPA and SAPUI5 plugins, obtain them here: https://tools.hana.ondemand.com/juno. Use the "Install New Software" function in Eclipse Help and point the installer to that address.

There are four sets of plugins; the ABAP and HANA Cloud parts are not needed to follow this blog.

 

Note #1: if you have the previous plug-in release, un-install it first. If you don't, you can find that it overlaps with the new features and some tree-nodes, etc. in Eclipse tools will be duplicated but contain different content.

 

Note #2: I had a problem with the Android feature install; to get around this I simply deselected it from the install, as I do not plan to build any Android apps right now.

 

 

Let's Build!

 

Start Eclipse and open the new OData perspective.

          perspec1.pngperspec2.png

 

Connect to Gateway

A Gateway server instance is required to provide a service catalog feed. GW server connections are maintained here:

          scat setup1.png Click the 'G' connection to create a new server connection

 

( If this view is not visible, open it from the window menu: )

                                                                                          scat setup0.png

 

Enter the server details and OK it. The connection will be tested.

 

          gwdetails.png

 

If the connection is established, you get a new link.

 

 

new conn.png

Click to expand the list...service list.png

 

 

Check that the service you are basing your application on is present - if not you will need to check the registered services in the server itself. We are going to use CATALOGSERVICE (if that's not present you have a major problem).

 

 

Start a new project

 

          new project.png

 

Name it and choose the base type of project and template - we use HTML5 and opt for SAPUI5 template (desktop/browser style).

 

         project name.pngproj type.png

 

The next step is to set the service URL - enter it directly or use the Service Catalog browser to choose it.

 

          choose service - gw.png

               choose scat.png

 

Once chosen, the service model details are shown; you can check here that you do have the right service.

 

scat chosen.png

 

 

Adding views

 

The view wizard will now place a default view definition in front of you. The default entry point into an OData service is always a collection so the wizard sets the view type to 'List'. It will then locate the 'addressable' entitysets in the model - if a set is not marked as addressable, it can only be reached via a parent entity.

This setting is also seen in the GWPA tools ('clientPageable'?)

 

view 1 - set coll.png

 

We could jump straight to the Service collection but I will go in by the CatalogCollection.

 

Once I have chosen this, I need to choose the fields for the view with 'Add'. The fields of the relevant entity, based on my set choice, will be presented for selection. I tick the ones I wish to display.

 

add fields.png

 

I get a list of three fields, but I want the columns in a different order, so I use the Move tool to reorder them.

 

move up.png

 

View 1 is done, but that doesn't show a lot. The meat of the data is in the service list, so let's create a view for that.

 

The view hierarchy follows the model hierarchy; we can only proceed to a new view if there is an association (relationship) from the current view to the next entityset, or if we wish to see the detail of a set member. This is only true for the template model; you have much more freedom once you start to write your own view controller logic.

 

Add a new view with the 'Add' button on the views column. A new view proposal appears - this one is a detail view for the catalog set member; that's not what we want here.

 

add view 2.png     default view2 not wanted.png

 

First change the view type to 'List'. There is a possible navigation to a list, so this option is available; otherwise it would be Detail only.

 

view 2 change type.png

 

There is a navigation property called 'Services' which navigates to the service list. Name the view to match.

view 2 as list.png

 

Repeat the steps for selecting and ordering fields for the view.

 

view 2 added ordered fields.png

 

Finally we can add a third view which is a detail view of the service entity (you should be familiar with the steps now).

 

view 3 done.png

 

Finish the wizard, and your project is built. You should get something like this in the Project Explorer (not to be confused with Navigator, which is similar but not as detailed).

 

project built.png

 

My SAPUI5 libraries are locally hosted, so I need to alter the bootstrap script in 'index.html' so that the resources library link is correct for local execution. If you are working with libs hosted on the AS, I think it should work with no edits (our server installation of UI5 libs is in progress, so I cannot confirm this at the time of writing).

 

bootstrap1.png

 

For the same "local" reason, I also need to edit the connectivity.js file because the GW server needs to be accessed by a proxy and the absolute address has been hardwired into the file.
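
The kind of change needed is along these lines; this is illustrative only, as the variable names and structure of the generated file will differ.

// Illustrative only - the variable name and proxy prefix are made up; the generated file differs.
// Before: absolute Gateway address hardwired into the file
// var serviceUrl = "http://gwhost.example.com:8000/sap/opu/odata/IWFND/CATALOGSERVICE/";
// After: go through the local proxy so the browser's same-origin restrictions are satisfied
var serviceUrl = "/proxy/sap/opu/odata/IWFND/CATALOGSERVICE/";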

 

          service url 1.png     service url 2.png

 

Further background for this local development environment can be found in DJ Adams' post over at http://scn.sap.com/community/developer-center/front-end/blog/2013/01/31/getting-started-with-sapui5-alternative-guide (and thanks for the extra help, Deej).

 

Make sure all of the changes are saved and we are good to go for testing.

 

Test

 

http://localhost:8888/ui5proj/Service%20Catalog/WebContent/index.html

 

or your equivalent pointing to the application WebContent, should yield:

 

View 1

 

ui51.png

 

View 2

 

ui52.png

View 3

 

ui53.png

 

Note: you may get an http error if you drill down into detail on any '/IW*' services. I think this is a problem with the key containing escaped values for the '/' parts. Any custom services should be fine, as in the example.

 

This should provide a good starting point for learning the basic internals of SAPUI5 scripting.

OData custom parameters - Gateway can have them too!


Since starting with Gateway I've always wondered why one of the simplest OData conventions is not implemented in Gateway yet.

 

Simply stated, at the end of a resource path you can add typical query string parameters that you would see in other URI schemes - makes sense.

 

e.g.  <resource_path>/<resource_endpoint>?sap-language=EN

 

If you call a service with this kind of URI scheme, it won't object. Nor will it pass those parameters to your backend provider.

 

From previous REST developments in SAP, I knew that the query string was probably being accepted and even present, so why was it lost?

After a debugging session that would have benefited from 3D goggles, I found the answer.

 

a) Gateway uses the query string for other "stuff".

b) If that other stuff, such as $expand is present, then it gets processed.

c) If what is in the query string is not within an elite group of parameters, it is unceremoniously discarded.

 

I thought this was rather harsh - they are only trying to be helpful. Time to be helpful in return.

 

I want my parameters. I planned them, raised them and sent them off on a journey and I want to see them arrive safely at Uncle Backend's home. Uncle Backend will put them to good use and probably keep me in his will.

 

Now I know the following is probably going to dislodge a very firm stick from a very tight posterior socket somewhere, but I decided to do something about it.

 

 

"Get your custom parameters here!"

 

The first step was to enhance the OData processor, namely class /IWFND/CL_SODATA_PROCESSOR, method INIT_REQUEST. If you have separate GW and BEP systems, this class will be in the GW instance.

 

This was done with the following implicit enhancement on the post-exit.

 

postexit.png

 

The basic story here is that the exported table 'et_parameter' is responsible for persisting the parameter name-value pairs through the request cycle.

It is at this point that all will be lost if we cannot grab them - this piece of code does that. Anything that was in the query string is saved.
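For readers who can't make out the screenshot, here is a minimal sketch of the kind of post-exit logic involved - not the actual code from the image. The table lt_uri_params (assumed here to hold the raw query-string name/value pairs available in this scope) and the NAME/VALUE components of et_parameter are assumptions about the local environment of INIT_REQUEST, so check them against your release before reusing anything.

" Sketch only - lt_uri_params and the NAME/VALUE components of
" et_parameter are assumptions; verify against your release.
DATA ls_param LIKE LINE OF et_parameter.
FIELD-SYMBOLS <ls_uri_param> LIKE LINE OF lt_uri_params.

LOOP AT lt_uri_params ASSIGNING <ls_uri_param>.
  READ TABLE et_parameter TRANSPORTING NO FIELDS
       WITH KEY name = <ls_uri_param>-name.
  IF sy-subrc <> 0.
    " Keep whatever the standard processing would otherwise discard.
    CLEAR ls_param.
    ls_param-name  = <ls_uri_param>-name.
    ls_param-value = <ls_uri_param>-value.
    APPEND ls_param TO et_parameter.
  ENDIF.
ENDLOOP.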

 

How do you access them in the backend? Well, they aren't exposed in the CRUD method signatures, but they are present in the provider runtime object - you just need to know where to find them.

Or write a method in the data provider to read them - like this:

 

method.png

 

It is then just a case of calling this method with a parameter name to see if it is present and get its value.

 

call.png
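As a rough sketch of what the two screenshots above show (the reader method and its call): something along the lines below would do the job. The method name GET_CUSTOM_PARAMETER and the attribute mt_parameters (a name/value table filled from the saved query-string pairs) are hypothetical; the real code reads the pairs from the provider runtime object as shown in the images.

" Sketch of a reader method in the data provider (all names hypothetical).
METHOD get_custom_parameter.
  " Assumed signature: IMPORTING iv_name TYPE string,
  "                    RETURNING VALUE(rv_value) TYPE string.
  FIELD-SYMBOLS <ls_param> LIKE LINE OF mt_parameters.

  READ TABLE mt_parameters ASSIGNING <ls_param> WITH KEY name = iv_name.
  IF sy-subrc = 0.
    rv_value = <ls_param>-value.
  ENDIF.
ENDMETHOD.

" ...and inside any CRUD method of the DPC:
DATA lv_language TYPE string.
lv_language = get_custom_parameter( 'sap-language' ).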

 

 

Rather than write this method into every DPC class, I'd suggest putting it into an interim custom class inserted into the class inheritance, if possible. That way it will appear in all your DPCs. It's easy to do in SP03, but a bit harder in SP04 with SEGW-generated code, I suspect.

 

I have used this enhancement with no ill effects on a live project, on both the SP03 and SP04 versions, but anyone wishing to use this solution does so at their own risk.

 

Regards

 

RS

Microsoft BUILD 2013


build.jpg

 

As you know, SAP NetWeaver Gateway is all about developers. That said, it is not only about SAP developers.

The Microsoft BUILD developer conference 2013 will start in San Francisco tomorrow.

In this blog I will post about some of the news from BUILD 2013.

 

The official homepage can be found here: http://www.buildwindows.com/

Follow the event on twitter: https://twitter.com/bldwin

Interesting article by ZDNet: http://www.zdnet.com/the-build-up-to-microsofts-build-2013-7000016901/

 

Day 1 - The start button is back...

 

Day 1 started - no surprise here - with a keynote by Steve Ballmer.

One of the leading topics of the keynote today was the rapid release cadence and its importance to Microsoft in today's world.

To 6,000 developers at the BUILD conference - and countless more online - Steve presented the tools and devices to make this happen.

 

One of the most anticipated "tools" was Windows 8.1.

With great applause, the audience welcomed the fact that the Start button will be back in Windows 8.1. In addition, it will be possible to boot directly to the desktop again.

 

You can download a Windows 8.1 Preview here.

A Windows 8.1 Product Guide for App Developers can be found here.

Finally Tools and SDK can be found here.

 

That said, the way is now open for you to build interesting Gateway apps on Windows 8.1.

 

In terms of new devices, Steve presented several (e.g. Nokia's Lumia 928 and 925 and, for outside the US, the 521).

Sprint will add Windows Phone devices to its portfolio (e.g. HTC Windows Phone 8XT, Samsung ATIV S Neo) and thereby broaden the reach of Microsoft's Windows phones.

 

Interesting for developers is the fact that Steve stressed the importance of touch for modern applications, while also stressing that it has to mix with existing user paradigms. The same is true for applications: in addition to modern apps - downloaded through Microsoft's marketplace - it was made clear that classical desktop apps are not forgotten and that the mixture is important to the overall experience.

 

In terms of the reach of Microsoft apps, it might be interesting for you to read that Microsoft expects to pass 100,000 applications in its marketplace this month.

 

In addition to Steve, Julie Larson-Green, Antoine Leblond, and Gurdeep Singh Pall presented the latest and greatest, e.g. Visual Studio 2013 (preview available here), new bing features, and gesture control for Windows 8.1 devices.

 

Finally, Rusty McLellan and Dave McCarthy showed the Spark game and development environment for games, with a very interesting approach to multi-channel usage of applications - even though it is a game, it was still interesting to see.

 

Complete keynote

 

Welcome to the show (by Steven "Gugg" Guggenheimer - Microsoft VP and Chief Evangelist)

 

Day 2 - There it is: SAP in the keynote

 

Satya Nadella - President, Server & Tools Business - led today's keynote. In contrast to yesterday's keynote, the focus today was more on backend systems and enterprise applications. Interesting to note is the fact that the release cadence for backends is even faster than for the frontends.

 

More specifically the keynote was mainly about 'the cloud for modern business'.

 

The presentation started with some impressive numbers:

  • >50% of Fortune 500 companies use Windows Azure
  • 3.2 million organizations with Active Directory accounts, with 68 million users
  • compute + storage doubling every 6 months
  • 8.5 trillion storage objects
  • 900k storage transactions per second (2 trillion/month)

 

To prove the battle readiness of Azure some additional numbers were provided:

  • XBOX Live - 48 million subscribers
  • Skype - 299 million connected users
  • Outlook.com - 1 million users gained in 24 hours
  • Outlook 365 - Nearly 50 million Office Web Apps users
  • SkyDrive - 250 million accounts
  • bing - 1 billion mobile notifications a month
  • XBOX Live (again) - 1.5 billion games of Halo

 

After the introduction the keynote took a closer look at the different channels in that context.

 

Starting with the Web, the integration with Visual Studio 2013 was shown, in addition to ASP.NET improvements.

It was stressed that building Web apps is pretty simple and that there is out-of-the-box support for different browsers (not only IE).

 

A closer look at the mobile channel completed the multichannel story. The deep integration into the development environment was well demonstrated.

 

Scott Guthrie - Corporate VP Windows Azure - then presented the autoscale feature of Azure and what it means for services such as Skype.

With the autoscale feature, Azure costs can be reduced dramatically (>40%). Basically, the feature allows you to set rules for scaling your cloud environment (i.e. adding and removing capacity, and thus reducing costs to the necessary amount).

 

The keynote overall contained so much information that it is pretty hard to choose the most relevant parts. However, let me finish with the demo of BizTalk Services connecting to SAP systems using REST.

 

The complete keynote will be available tomorrow - I will post the link then and provide the position of the SAP bit.

 

After the keynote - or rather in its second part - Steven "Gugg" Guggenheimer - Microsoft VP and Chief Evangelist - presented how the Windows 8 architecture allows code to be reused across the different Windows 8 devices. In addition, supporting companies were mentioned - e.g. Walgreens, which offers an API to Windows developers.

 

Heads up for tomorrow: no keynote.

NW Gateway: Tips & Tricks


In this blog I want to share some experience I have gained with NW Gateway.

 

Content

  • NetWeaver Gateway documentation
  • Caching aspects while testing
  • Service analysis & debugging
  • Various tips
  • Performance

 

Available documentation for the NetWeaver (NW) Gateway add-on

 

- SAP NW Gateway 2.0 SP06 documentation

 

Since NW Gateway is part of the NetWeaver 7.4 release, separate documentation, now called "Gateway Foundation", is available as well.

- SAP NW 7.4 Gateway Foundation documentation

 

Caching settings while testing / developing

 

While your application is still in the test phase, it's recommended to disable the cache in all relevant systems (NW Gateway, possible middleware like SMP [SAP Mobile Platform], and your client). In Gateway you can deactivate the cache via Customizing (transaction SPRO). While you constantly test and extend your application/service, this helps you avoid receiving data that comes from the cache. Even for performance tests it's better to deactivate the caches, to see how your application works under real circumstances.

 

You'll find the Gateway Customizing settings here: Tx SPRO >> SAP NetWeaver >> Gateway >> OData Channel >> Administration >> Cache Settings

 

It's also possible to create cleanup jobs or clean the caches manually via transactions /IWFND/CACHE_CLEANUP and /IWBEP/CACHE_CLEANUP.

 

Service analysis & debugging

 

Browser debugging

You can debug your OData services just as you would any other website. In Google Chrome you have a built-in developer tool, which can be activated via the menu: View >> Developer >> Developer Tools.

 

The most important tabs in the Developer Tools for analyzing your services are Resources, Network and Timeline.

 

Timing / error analysis for ABAP methods

SAP gives you the possibility to trace the execution time of all methods called during your OData request. Just insert "sap-ds-debug=true" after the question mark in your URI. You'll get the response plus several tabs with more detailed information on your request. In the Runtime tab you'll see the execution times of your ABAP methods. If you need this information permanently, for documentation purposes, you can download it as an HTML file to disk; just change the URI parameter to "sap-ds-debug=download".
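For example, against a hypothetical service and entity set (both names are made up here), the call would look like this:

/sap/opu/odata/sap/ZDEMO_SRV/ProductSet?sap-ds-debug=true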

Screen Shot 2013-06-21 at 09.25.52.png

          Example screenshot for URI parameter sap-ds-debug=true

 

If you get an error for your service call, you'll see an additional "Stacktrace" tab in your browser.

 

stacktrace.png

The URI parameter sap-ds-debug=true will also work in transaction /IWFND/GW_CLIENT.

 

 

Performance Trace in NW Gateway

 

SAP also provides a separate tool in NW Gateway for analyzing ABAP method execution times. You can start this tool via transaction /IWFND/TRACES. The nice thing about this tool is that you can easily configure the trace and then click through all the pages of your UI5/native application on your client. After that, look into the traces transaction to see which services were called, and dive deeper into each service measurement.

 

To start a new measurement just follow these steps:

  • right-click on "Users & Request URI Prefix" and enter a user name _or_ a URI prefix
  • Configuration tab: set the performance trace to "Active"
  • save the configuration; this will activate the trace for 2 hours
  • call your URI via the client
  • refresh the trace list

 

For further details, please look into the official documentation by SAP:

http://help.sap.com/saphelp_gateway20sp06/helpdata/en/9d/da3a2ceca344cf85568ae927e9858d/frameset.htm

Screen Shot 2013-06-21 at 09.24.58.png
          Example screenshot on performance trace in NW Gateway system (tx: /IWFND/TRACES)

 

 

 

Various tips

 

1) OData data types in URIs

 

Respect data types which need an identifier before the actual content, like Edm.Binary, Edm.DateTime, Edm.Guid, Edm.Time and Edm.DateTimeOffset.

For example, when using an Edm.Guid type as a key in a URI, it has to look like this: ServiceEndpoint(guid'xxxx-yyyy-...'). See all the examples here:


http://www.odata.org/documentation/odata-v2-documentation/overview/#6_Primitive_Data_Types

 


2) Eclipse Gateway add-on: services aren't visible

Mark every entity set that is considered an 'entry point' for your client as addressable. Entity sets which are only accessed via URI navigation shouldn't be directly addressable via a service URI to the outside world (see the comment by Ron Sargeant).

 

 

3) POST & PUT payload for date properties

 

In my setup I always have to provide the attribute m:null="true" for date properties (OData type Edm.DateTime) in the payload. I have to do this even if my date property has the "nullable" flag set in transaction SEGW; otherwise I receive the error CX_SXML_PARSE_ERROR: Error while parsing an XML stream.

 

So these tags don't work for me:  <d:StartDate></d:StartDate> or  <d:StartDate />

I have to set the payload as follows:  <d:StartDate m:null="true">


 

 

Performance

Always keep performance in mind with OData, especially for mobile clients. There are so many aspects to performance that I'm thinking of writing a separate blog about this topic.

 

  • a mobile device is limited to 5 HTTP requests in parallel
  • the application cache on a mobile device (e.g. Safari) lives only in program memory and not in flash memory; this is so that a huge number of read/write requests doesn't wear out the flash memory >> better to use the web application cache (5MB available)
  • use the $expand query option in OData to save requests on slow networks (3G), but bear in mind that the response is bigger

 

For more info, please take a look here:

- http://www.igvita.com/slides/2013/breaking-1s-mobile-barrier.pdf

 

- blog about best practices on performance in Gateway with many Dos and Don'ts by David Freidlin

 

 


SAP NetWeaver Gateway 2.0 SP06 install on Parallels on OS X - hints


I run a number of VMs using Parallels on my Mac and wanted to do the same for the available NetWeaver Gateway 2.0 SP06 image. There were a couple of issues after the install that needed to be resolved before I could use the system.

 

VMList.png

 

The install was very quick and straightforward without any problems.

 

The first issue I hit was when starting the system using SAPMC. The user and hosts had been configured during the install, but the msg_server process had not started. Looking at the log, the host nwgw was not reachable.

 

failedprocess.png

 

Checking the /etc/hosts file, no entry for nwgw existed. ifconfig -a yielded the IP address and the entry was added to the hosts file:

192.168.1.12      nwgw

 

With the hosts entry added, Gateway started and the provided services could be accessed.

 

workingprocess.png

 

The next hurdle was to get my own services working. A model/service was created and configured.

 

serviceenable.png

 

model.png

This image combines both the Gateway server and the IWBEP add-on for service enablement, so transactions for both are available in the system.

 

/IWFND/MAINT_SERVICE is the transaction for enabling the new services.

 

addservice.png

 

Selecting the service to be added caused a short dump:

 

sdump.png

Looking at the errors, they were related to an RFC call to the NONE destination failing. Digging around, Note 1532825 alludes to the suggestion of using RFC_PING. Simply running RFC_PING against the NONE destination from SE37 solved the problem.
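If you prefer to run the ping programmatically rather than through the SE37 test screen, a throwaway report along these lines does the same thing (a sketch; the report name is arbitrary):

REPORT zping_none.

" Ping the NONE destination once - the programmatic equivalent of
" executing RFC_PING against destination NONE in SE37.
CALL FUNCTION 'RFC_PING'
  DESTINATION 'NONE'
  EXCEPTIONS
    communication_failure = 1
    system_failure        = 2
    OTHERS                = 3.

IF sy-subrc = 0.
  WRITE: / 'RFC_PING against destination NONE succeeded.'.
ELSE.
  WRITE: / 'RFC_PING against destination NONE failed.'.
ENDIF.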

 

rfcnone.png

 

With the NONE issue resolved, it was now possible to create and run my own services:

 

gwtest.png

Unable to expand line items using $expand when used along with $filter


Dear Folks,

 

In my service, I am able to query the header collection with a filter (the Delivery_header collection).

 

Now I want to expand the line items (Delivery_line-items) for each delivery number. I created the associations of the principal keys of the delivery header collection with the line item collection fields. I also provided the required navigation property.

 

I tried the following syntaxes but to no avail.

 

sap/opu/odata/sap/DELVRY_SRV/DELIVERY_HEADER?$filter=C_Tid eq 'S0000001284'&$expand=DELIVERY_ITEM,

 

sap/opu/odata/sap/DELVRY_SRV/DELIVERY_HEADER?$filter=C_Tid eq 'S0000001284'?$expand=DELIVERY_ITEM,

 

sap/opu/odata/sap/DELVRY_SRV/DELIVERY_HEADER?$filter=(C_Tid eq 'S0000001284')&$expand=DELIVERY_ITEM.

 

Please provide me some insight in this regard.

 

Thanks,

 

Kawish

Custom generation strategy – how to enhance the generated data provider base classes


This article explains how I became interested in the service generation code and the means to alter it, to suit my own requirements.

 

A bit of history…

 

I have identified certain generic ABAP-level operations that I wish to be able to re-use for any resource in services that I implement.

 

The originals were implemented as code-per-service, which led to a lot of cutting & pasting and duplication.

 

To reduce these overheads and enforce a design pattern, I introduced a custom superclass for my DPC base class, i.e. I subclassed /IWBEP/CL_MGW_PUSH_ABS_DATA to ZCL_MGW_PUSH_ABS_DATA.

 

By changing the inheritance of my DPC base, injecting the new intermediate class into the family, I could inherit any new attributes, types or methods I implemented in ZCL_MGW_PUSH_ABS_DATA.
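In outline, the intermediate class looks something like the sketch below. The helper method is a made-up placeholder and the class is declared abstract here purely so the sketch stays valid even if the standard superclass has abstract methods; the point is only that anything added at this level is inherited by every generated DPC base class once the inheritance is re-pointed.

" Sketch of the intermediate superclass (helper is a hypothetical example).
CLASS zcl_mgw_push_abs_data DEFINITION
  INHERITING FROM /iwbep/cl_mgw_push_abs_data
  ABSTRACT.

  PUBLIC SECTION.
    METHODS build_log_message
      IMPORTING iv_text          TYPE string
      RETURNING VALUE(rv_result) TYPE string.
ENDCLASS.

CLASS zcl_mgw_push_abs_data IMPLEMENTATION.
  METHOD build_log_message.
    " Trivial placeholder for the kind of generic, service-independent
    " operation that would really live here.
    CONCATENATE 'DPC:' iv_text INTO rv_result SEPARATED BY space.
  ENDMETHOD.
ENDCLASS.

" The generated DPC base class is then re-pointed at it, e.g. (example name)
" CLASS zcl_zmyservice_dpc DEFINITION INHERITING FROM zcl_mgw_push_abs_data ...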

 

This was a good solution with one drawback – if the model is regenerated in SEGW, the DPC class is also regenerated, and that regeneration would remove my inheritance change, so I would have to reapply it manually. Not too bad, as I know about this step - but what about in the future, when this goes into another developer's hands? I wanted a robust solution, so I asked the Gateway product team if they could add this feature to Service Builder generation – otherwise I would need to implement a planned enhancement in the Service Builder code. They suggested I look at the Generation Strategy option and define my own as a way of providing a solution.

 

“Great”, I replied, “where are the documentation and tools for this?”. 

 

Well, the documentation was a no-show (perhaps it exists somewhere, in German?), but Martin Bachmann did point me to the configuration area. From there I was able to figure out how to resolve my planned enhancement into a configuration option instead (with some code still needed).

 

Since there is nothing out there about this at the time of writing I thought it would be a good experience to share.

 

 

Service Builder…is configurable!

Yes, that’s right! It’s actually possible to change some of the Service Builder functionality by altering the plug-in elements.


Word of warning: this is not an activity that should be undertaken by someone without advanced developer skills. What I am doing here is a fairly lightweight change, but it would be possible to radically alter (and damage) your service builder installation and the service generation without due care and attention.

 

 

A bit of experimentation…

 

If you are still reading, you obviously like to live dangerously.

 

To access the SEGW configuration, use transaction  /IWBEP/SBS .

 

This is a cross-client configuration area, so you have no fail-safe in a backup client.

  • Make notes of any and all changes you make.
  • Log changes in a transport (which may or may not need to go to your PRD environment).
  • Revert any changes that you make on a trial-and-error basis and have no need for – you may be messing up something that you aren't using but someone else is.

config.png

 

There are various nodes that can be changed but the one that jumps out at me is the ‘Generation Strategies’ node - nice and obvious.

However, it belongs in the 'DM plugins' node set – which parent do I need to work with? I'm pretty sure it's /IWBEP/GEN based on its description, but I also check the content of the 'Generation Strategies' node.

 

I've noticed the generation strategy as an option when creating a service, but it has always been a single choice - '0001' - so I've never really worried about it. Now, that single option helps me confirm that I have the right node, because it only contains '0001'. I could be wrong, but I decide to carry on and revert my work if I am.

gs1.png

 

The first step is to add a new strategy. I don't pick '0002', as that might conflict with a future SAP strategy, so I opt for '9000', much as I might for a custom dynpro (ah, good times).

gs2.png

 

 

Note: this only makes '9000' available as a choice; it doesn't actually have any further settings in this view.

   

Only one small change, but I can now check that I am on the right track. I choose 'Create project' in SEGW and find I now have two options for the generation strategy.

 

newproj.png

I can choose my strategy; now, how can I apply it?

 

Looking back at the DM plug-in setup for /IWBEP/GEN, it nominates class /IWBEP/CL_SBGN_PLUGIN as a delegate, so I’ll look at that.

 

I find the class has only two locally implemented methods, its constructor and CREATE_GENERATION_STRATEGY. The latter sounds promising and contains:

 

Untitled-1.png

 

Jackpot!

 

Strategy ‘0001’ creates an object to manage the generation and returns it. It seems logical that I can do the same with ‘9000’ and a different object in the return.

   

So, what will my replacement plug-in need to do when '9000' is chosen?

Looking at the '0001' logic, the object type returned is an instance of /IWBEP/CL_SB_GEN_GENERATOR. I note that as another object that needs changing.

    

Looking at  /IWBEP/CL_SB_GEN_GENERATOR, I find method GENERATE_DPC. So far, all the method names have been very helpful! However, that is now invoking a further class.

 

DATA lo_dpc_generator TYPE REF TO /iwbep/cl_sb_gen_dpc_generator.

CREATE OBJECT lo_dpc_generator.

lo_dpc_generator->/iwbep/if_sb_gen_dpc_generate~generate_dpc( ... ).
Within this second  “DPC generator” class I find what I am looking for in method GENERATE_DPC_BASE_ABST_CLASS. Here is a hard-coded reference to the superclass name /IWBEP/CL_MGW_PUSH_ABS_DATA. That’s all I want to change.

 

Untitled-1.png
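The actual generator code is in the screenshot above; purely as an illustration of the shape of the change (the variable name lv_superclass is invented, not the generator's real name), it boils down to swapping the hard-coded literal:

" Illustration only - lv_superclass is a made-up name; see the screenshot
" for the real statement in the redefined copy of the method.
" Was:   lv_superclass = '/IWBEP/CL_MGW_PUSH_ABS_DATA'.
lv_superclass = 'ZCL_MGW_PUSH_ABS_DATA'.   " inject the custom intermediate class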

 

I now know that I have to alter three levels of the default generation strategy to make mine work.

 

Untitled-1.png

 

 

 

Back to the config of DM plugins: I opt to copy /IWBEP/CL_SBGN_PLUGIN and place the copy in the configuration (/IWBEP/CL_SBGN_PLUGIN is ‘final’ so it can only be copied – I would rather subclass it).  I also make a copy of the original /IWBEP/GEN, comment it as such and deactivate it.

 

Untitled-1.png

 

ZCL_SBGN_PLUGIN needs to return a custom generator object based on /IWBEP/CL_SB_GEN_GENERATOR when strategy '9000' is chosen.

    

Now I have a problem; I need to use the same class basis as /IWBEP/CL_SB_GEN_GENERATOR  for the return parameter to work as the correct reference type, but inheritance doesn’t work as too many private methods are present, even though the class is not final.

My only option is to copy /IWBEP/CL_SB_GEN_GENERATOR into ZCL_SB_GEN_GENERATOR; I’ll change the GENERATE_DPC method later.

   

I add a new WHEN case for 9000 – ZCL_SBGN_PLUGIN now processes strategies 0001 and 9000.

 

when '9000'.
  if iv_gen_strat_version is supplied.
    lv_gen_strat_version = iv_gen_strat_version.
  else.
    lv_gen_strat_version = lc_gen_strat_version.
  endif.

  create object ro_gen_strategy
    type zcl_sb_gen_generator              " <=== custom type returned
    exporting
      is_gen_strategy      = is_gen_strategy
      iv_gen_strat_version = lv_gen_strat_version.

 

 

 

 

Next, I look at what is needed for /IWBEP/CL_SB_GEN_DPC_GENERATOR and how I can alter method GENERATE_DPC_BASE_ABST_CLASS.

This class isn’t final but it is heavily privatised. I subclass it to ZCL_SB_GEN_DPC_GENERATOR but that gives me very little benefit and not a lot of security against version changes in the IWBEP component. All of the private methods are redefined and replaced with copies of the originals.

 

Since almost everything in this inherited class is a copy, I can change GENERATE_DPC_BASE_ABST_CLASS without too much concern.

 

Untitled-1.png

 

 

Addendum: I've since found that the regeneration of a "9000 service" dumps. The cause was hard to find but the fix was simple. The inheritance change is also referenced in the redefinitions preparation.

So this also needs changing to avoid such a dump (another ZCL_SB_GEN_DPC_GENERATOR method).

 

Untitled-1.png

 

Now that I have the final Z class built, I can return to ZCL_SB_GEN_GENERATOR and change its GENERATE_DPC method.

 

Untitled-1.png

 

 

 

That is all of the class building and config done, so I just need to verify that it all works.

Untitled-1.png

Class injection has worked.

 

Untitled-1.png

And my custom methods have all been inherited.

 

 

A few notes

 

  • This could also be used for changes to model provider generation - currently I don't have a pressing need to change the MPC.
  • Once a generation strategy is set, you cannot change it unless you delete and rebuild the project. I think this limitation might need to be changed by SAP if we are to use this feature.
  • I could modify the generation code so that it composed the additional methods into my base class, or something similar, but I chose not to; this makes maintenance simpler because it links to a non-composed class which can be tested and extended without touching the generation strategy code.
  • The other generation plugins can be adapted to suit.

 

 

I hope this gives the reader some ideas about unlocking some of the power of a configurable builder tool.

 

Regards

 

Ron.    


Improved Inside-Out Modelling


What is inside-out modelling?

It is a design paradigm that takes a business component and models it so that it can be exposed as a service. The entities and properties thereof are generally driven by the component interface. The most common form of inside-out driver is the RFC function module, although BOR objects and others like GENIL are available.

 

This is in contrast to outside-in model design, where the service that is required is modelled and the appropriate backend components are located or built to serve the consumption model.

The important phrase for me is “to serve the consumption model”; conversely, inside-out design can be paraphrased as “a service imposed by the business model”. Gateway consumers should not really be concerned with or know about the business model; it is generally a lot bigger and more complex than a particular service use case would need.

 

I am not a huge advocate of inside-out modelling but where it is deemed necessary, it can often be improved to work in a Gateway OData service context.

 

Let's take the SAP expenses application and data concept as a starting point. "Expenses" is quite a complex beast; it has not been designed with modular access in mind and is very monolithic in nature. Despite a revamped UI in Web Dynpro, it hasn't really altered in its backend logic or model.

 

Rather than tackle the whole of the expenses component, I’m going to focus on one part – “trip options”. These are essentially the lists of options that you can choose for data entry while completing an expenses form. They are typically used to provide context and fill dropdowns or other value choice controls in a UI. What I found interesting about this part is that it mirrors certain aspects of the expenses component on a macro level.

 

If you wish to obtain the trip options, you can get the data from the BAPI_TRIP_GET_OPTIONS function.

 

This function returns 20 tables of various data! Here is a prime example of where inside-out design fails for me – how am I going to map 20 outputs to one entity? Typically one RFC is mapped to an entity to satisfy the READ operation.

 

At this point I would abandon any hope of providing a well-conceived service and look at the outside-in approach – but more on that later.

 

Back to the modelling exercise: if I have to do it this way, how do I do it well?

 

One BAPI, twenty collections

Do I need all 20 tables for my service? Those I don’t need to fetch I can remove from the plan.

 

For example, I’ll use these:

  • EMP_INFO
  • DEFAULTS
  • EXPENSE_TYPES
  • COUNTRIES
  • CURRENCIES

 

(In reality I’ll probably need more but 25% is a good cut for an example.)

 

Now, the approach that the majority of beginners in Gateway will take is to try and send all five of these tables back in one read operation, based on the assumption that simple request/responses can return multiple tables (like a BAPI!).

 

That is not the way to do it. You can’t do it. Fundamental OData is based on flat structures.

 

Here’s where the promised improvement starts…

 

What you do is create five entities with corresponding entitysets (build sets only if required; EMP_INFO for example has a cardinality of 1:1).

Each entity/collection is read by a separate GET request.

 

This has the following advantages.

 

  • The service entities are decoupled from the backend model; after all they are siblings, not part of a dependency set.
  • A collection can be read when it is required, rather than obtained as part of another package of collections. One, a few or all of the entities can be accessed as required without any request constraints.
  • More of the collections can be exposed if the service evolves, or conversely, deprecation is easier.

 

It does carry some disadvantages, however:

  • The BAPI logic obtains all 20 outputs regardless of those that are required; despite the outputs being optional, pretty much all data is returned.
  • Using the BAPI still accesses 100% of the data when I only want 25%.
  • Each read of an entityset obtains the other four entitysets again!

 

In performance terms, where I only want to read five collections with one BAPI call, I am reading 100 tables (5 calls x 20 tables). Not great!

 

In this inside-out model, there has to be a balance between the unnecessarily complex and unclear means of implementing a method for obtaining the five collections and the enforced constraint of excessive access.

 

...And the answer is..?

 

Moving towards a solution, some of you may ask, ”If we read the EMP_INFO entity with this BAPI, it would have read the other four data tables for the other entities. Why don’t we just store these tables in memory then use them to fill the other sets when required?”

 

Indeed that would be a good idea; except that the request for EMP_INFO is stateless. If we store the other four tables, they are gone when we try a request for DEFAULTS as a second request, so the BAPI will have to go and read them again, plus the EMP_INFO we already have. And so on for the other collections.

 

Statefulness can be introduced fairly easily. It is possible to get five entitysets in one request with one BAPI call. The key to this is using the $expand option – when a request is “expanded” the required feeds for the expansion are evaluated in the same connection session, therefore the state is maintained for the duration of the request.

 

One drawback is that the client needs to know how to place the call in the right way to take advantage of this feature, however the feature is at least available!

 

The final model design

 

$expand is commonly used to obtain related entities; however, it can also be used to chain "unrelated" GETs into one request.

 

In order to place an expand request, there has to be some relation – but there is no clear relation between the entities; they are not hierarchical.

 

I’ll now take a technique from the world of outside-in modelling in order to help me realise the model. If the entity design is coming from the outside, it does not have to have a direct (or any!) correlation to a data model on the inside. As long as I can devise a way to place meaningful data in the feed or response, that entity is valid.

 

What I need is a common relation for all of my five (or even the full 20) chosen entities. This is quite obvious – the entities are all BAPI outputs, so it follows that I should look at the input.

In order to provide those 20 outputs, all I require is an employee number. To properly qualify the context I should also add a date and language, which are optional inputs but can make a difference.

 

Based on this information, I design an entity called TripContext with properties for employee number, trip date and language – I also make sure they are all keys. 

 

Untitled-1.png

 

I can then provide an association from my TripContext to each entity and collection in the trip option service.

 

Because I am going to relate my collections to another entity, I do not pay any attention to the input parameters during the RFC wizard steps. Choosing and realising inputs is another feature of this process that creates a lot of confusion if the RFC is not “mapping friendly”.

 

I can create all five entities with one import.

 

Untitled-1 copy.pngUntitled-1.png

 

Ignore the inputs and choose the five collections from outputs.

Untitled-1 copy.pngUntitled-1 copy.png

 

 

Mark a key set within each entity block.

Untitled-1 copy.png

 

Returned entities need to be renamed (they are upper case and refer to multiples in some cases).

Untitled-1 copy.pngUntitled-1 copy.png

 

Create the sets.

Untitled-1 copy.png

 

Create associations from TripContext to each of the options collections.

Untitled-1 copy.png

 

Finally, assign navigation properties to TripContext.

Untitled-1 copy.png

 

With the right implementation, I can now obtain all of the five sets of options with one request.

 

TripContextSet(PersonID='00000005',Language='EN',TripDate=datetime'2013-10-01T00:00:00.0000000')?$expand=Defaults,TripEmployeeInfo,TripExpenseTypes,TripCountries,TripCurrencies

 

Explanations:

 

TripContextSet(PersonID='00000005',Language='EN',TripDate=datetime'2013-10-01T00:00:00.0000000')

 

The primary entity is the TripContext. The values that I use to GET the entity are actually used to establish the context, i.e. these values become known to the data provider in the initial phase of the request. The entity itself does not exist; it is a "state directive", stating "this is the person in question and this is the date and language that may affect the outcome". I do not need to access any further data in relation to this entity, and the returned entity is the same as the request key.

 

The trick here is that I have now established the input for the following entities that I wish to obtain.

 

$expand=Defaults,TripEmployeeInfo,TripExpenseTypes,TripCountries,TripCurrencies

 

The expand option will locate each of the endpoints of the navigation properties that I defined. In the case of Defaults and TripEmployeeInfo, these are single entity feeds (cardinality 1:1) and the corresponding ‘GET_ENTITY’ method will be called. For the remainder, the corresponding ‘GET_ENTITYSET’ methods will be called.

 

 

Data Provider logic

 

I’ll make some assumptions for my DPC class.

 

  1. I’ll only ever want to access a single entity of type TripContext;  no collection logic is required.
  2. I'll only ever want to access TripContext in order to provide the context for an expanded feed. The return from this request is pointless without an expand option.
  3. None of the trip options feeds will work unless the TripContext has been established.
  4. The more expanded elements per request, the more efficient the provider is; however the returned feeds must be required for consumption for this to hold true!

 

Based on the above assumptions, I can introduce some efficiency.

 

When TripContext is requested, what is really being requested are the trip options that fit into that context. At this point (TRIPCONTEXT_GET_ENTITY) it would be a good idea to call the BAPI, as we have the input values for it.

 

There is a slight problem here – the returned data from the BAPI isn't required just yet; the return feed is a TripContext. However, I do know that the DPC is going to continue running during this RFC connection; the $expand option is going to invoke calls to the other GET methods.

I’ve got the entity data for those feeds already – so I’m going to store them.

 

In order to separate the concerns somewhat, I create an access layer class to manage the trip options.

The trip options manager object is then attached to my DPC as an attribute. It reads the BAPI and stores the results in its object space.

 

When I reach the second (implied by $expand) GET in the process, I can now ask the trip options manager to give me the return data.

I repeat this for each expected feed.
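To make the flow concrete, here is a minimal sketch of the pattern. The generated method names, their it_key_tab/er_entity/et_entityset signatures, the helper read_keys_into_context and the class zcl_trip_options_manager (which would call BAPI_TRIP_GET_OPTIONS once in its constructor and keep the returned tables as attributes) are all assumptions for illustration, not the actual implementation.

" Sketch only - all names are assumptions (see the note above).
" MO_TRIP_OPTIONS is a DPC attribute: TYPE REF TO zcl_trip_options_manager.

METHOD tripcontextset_get_entity.
  " The TripContext entity is a pure state directive: echo the keys back
  " and use them to initialise the options manager for this request.
  er_entity = read_keys_into_context( it_key_tab ).   " hypothetical helper

  CREATE OBJECT mo_trip_options
    EXPORTING
      is_context = er_entity.          " person, trip date, language
ENDMETHOD.

METHOD tripdefaultsset_get_entityset.
  " Called later in the same request via $expand - the manager still holds
  " the BAPI output, so the BAPI is not called a second time.
  IF mo_trip_options IS BOUND.
    et_entityset = mo_trip_options->get_defaults( ).
  ENDIF.
ENDMETHOD.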

 

 

Improved?

 

I consider this a much better implementation of an inside-out design. A typical implementation of this service, solely based on the BAPI, would not have been very elegant, efficient or as simple to consume.

 

What could have been a very inefficient and cumbersome implementation is now well-scaled and fairly simple. In traced tests this service can return the full feed in under 100 milliseconds.

 

However, it still reads more data than required and could be written even more efficiently using the outside-in modelling approach. I intend to tackle this same scenario in an outside-in manner in a future blog.

 

Regards

 

Ron.

 

 


Detailed step by step procedure for Creating Gateway Service with all the CRUD Operations and testing them in Service Explorer Part1


In this blog I will explain how to create a Gateway service with all the CRUD operations. Before starting, I expect that you have a basic idea of Gateway services and the Gateway Service Builder, i.e. SEGW.

 

1. Create a service in the Gateway system with all the CRUD operations (Create, Read, Update, Delete)


Create a project in SEGW (Gateway Service Builder).

Our project looks as below.


Now we need to build our data model. For that, first create the entity type by importing an RFC interface.

Give the entity type name, the RFC destination and the name of the RFC to be imported, and click Next.


Untitled.jpg

 

 

Select the required properties for the entity type.

Set the key property for the entity type; here we will make Material the key, then click Finish.

Our entity type MATR and its properties look as below.

 

Create the entity set with the necessary entity type.

Give an appropriate entity set name, select the necessary entity type and click Continue.

Our project looks as below.

After creating the entity set, the service implementation is also created, with empty CRUD operations that need to be implemented by mapping them to an RFC/BOR interface.

Now we need to save our changes in a transport request.

Now click the Generate button to generate the runtime artifacts.

The data provider and model provider classes will be created. Click OK.

Save the changes.

The runtime artifacts are generated and look as below.

Now we need to implement all the CRUD operations in the service implementation.

Create

Read/Query

Update

Delete


CREATE: This is to create a new entry. Right-click on Create and choose the 'Map to Data Source' option.

Give the RFC destination name, select the required RFC and click OK.

 

Untitled.jpg

 

Now click on Propose Mapping; the system will do the mapping automatically, or else we can do it manually for the required fields. There is no need to map all the properties of the entity type; we will map only the required ones. We do the mapping by dragging and dropping from the function module onto the data source parameter of the service operation.

Here all the required fields will have input mapping, as below.

In our case we need to pass constant values for BASE_UOM, MATL_GROUP and BASIC_VIEW; we will do that as follows: first append the rows, then maintain the values under the Constant Value tab.

Once the mapping is done, save your changes. Now we will implement the DELETE operation.

 

DELETE: Right-click on Delete and choose the 'Map to Data Source' option.

Give the RFC destination name, select the required RFC and click OK. Here we are mapping to BAPI_MATERIAL_SAVEDATA.

Do the required mapping as follows. Here we are passing the Material, DelFlag and MatlGroup fields, so do the mapping accordingly.

READ: Right-click on Read and choose the 'Map to Data Source' option.

Give the required RFC and do the mapping as below. Here we are mapping the Material, MatlDesc, IndSector, MatlType, MatlGroup and BaseUom fields. Except for Material, all of them have output mapping.

Note: in the READ operation all key fields should have input mapping; here in our example Material is the key.

 

 

QUERY: Right-click on Query and choose the 'Map to Data Source' option.

Note: in the Query operation all key fields should have output mapping. Here we can also map ranges directly to pass a range of values for selection.

Give the required RFC and do the mapping as below. Here we are mapping to BAPI_MATERIAL_GETLIST.

Here we will map the MatlDesc property to the appropriate function field and our key property Material to the range table; for that, just drag and drop the field onto the data source field. Click OK in the screen below.

Our mapping looks like below.

For display purposes we will add the Material property again, since all key fields should have output mapping in the Query operation.

For this, click the Create button, choose the Material property and do the necessary mapping as shown below.

 

UPDATE: Right-click on Update and choose the 'Map to Data Source' option.

Give the required RFC; here we are mapping to BAPI_MATERIAL_SAVEDATA. Do the required mapping as follows. Here we are going to update the MatlGroup field.

After completing all the implementations, save the changes and check for syntax errors.

 

Now click the Generate button again to regenerate the runtime artifacts.

Check the success messages.

With this we have successfully created a Gateway service implementing all the CRUD operations.

In the next part, Detailed step by step procedure for Creating Gateway Service with all the CRUD Operations and testing them in Service Explorer Part2,

we will maintain our service and test it in the Service Explorer.


 

 

Detailed step by step procedure for Creating Gateway Service with all the CRUD Operations and testing them in Service Explorer Part2


In the first part, Detailed step by step procedure for Creating Gateway Service with all the CRUD Operations and testing them in Service Explorer Part1,

I discussed the topic below.

 

1. Create a service in the Gateway system with all the CRUD operations (Create, Read, Update, Delete).


Before starting, I expect that you are able to create a Gateway service or that you have gone through my previous blog. In this blog I will discuss maintaining our service and testing it in the Service Explorer.

 

Our Service Name: ZBPS_MATR_DEMO_SRV.

Now we need to activate and maintain our service; we will do that in transaction /IWFND/MAINT_SERVICE.

     a. First we need to locate and add our service. Click on the Add Service button.

Enter the system alias and press Enter. From the displayed list, locate our service and click on it.

Save it in a package and click OK.

Select the request and click OK. Click OK again. Go back and click on our service.

b. Now we need to add a system alias to our service, i.e. the system with which we are going to interact.

Click on the Create System Alias button, enter the required system details, save the changes and go back.

Save the changes in the request.

Now we can see our system alias details.

Now we will explore our service; for that, click on the Explore Service button.

Click on Execute to get the Service Document.

Service Document:  Describes the location and capabilities of one or more Collections.

Here we can check our entity set name, and we will get our service URI.

 

Here our URI: http://<hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/

Now go back, choose the Get Service Metadata option as shown below and click Execute to get the service metadata.

Now we will get our service metadata.

Service Metadata Document (Metadata Document): Describes the data model (i.e. the structure and organization of all the resources) exposed as HTTP endpoints by an OData service.

Note the URI of our service; we will use it further on.

URI: http://<hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/$metadata


Testing Our Service.


Now we will test our service in the Gateway Client; the transaction for that is /IWFND/GW_CLIENT.

Paste our URI into the Request URI field, select the GET HTTP method and execute.

 

Check the service metadata.

Now we will test all the CRUD operations.


READ:

Here in the URI we can pass values for the key fields only.

In our example we are trying to READ Material '000000000000000023', and we will do that in the Gateway Client.

 

URI: http:// <hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/matrset('000000000000000023')

 

Paste this URI into the Request URI field in the Gateway Client, select the HTTP method GET and click Execute.

 

Output:

QUERY:

 

Query is used to get multiple entries, and here we can pass ranges for selection.

In our example we will try to fetch the Materials within the range '000000000000000023' to '000000000000000038'.

 

URI:

http:// <hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/matrset?$filter=Material ge '000000000000000023' and Material le '000000000000000053'

 

Paste this URI into the Request URI field in the Gateway Client, select the HTTP method GET and click Execute.

 

Output: We will get three materials in our output   

        Untitled.jpg      

 

 

CREATE:

 

Now we will try to create Material '000000000000000016'. For this, first we will READ an existing material and, using that XML, we will CREATE the desired material.

First, try to READ Material '000000000000000023' by using the same URI as in our READ operation in the Gateway Client.

Now click on the Use as Request button as below.

We will get the same XML in the HTTP request body. Make the required changes to it for the material creation. Here we need to select the HTTP method POST for the CREATE operation, and we need to change the URI as below.

 

URI:

http:// <hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/matrset

OUTPUT:

UPDATE: Now we will try to UPDATE the Material we have just created through the CREATE operation.

Follow the same steps as in the CREATE operation: first read Material '000000000000000016' through the READ operation, and by using that XML as a request we will try to UPDATE the material group of that Material.

URI: http:// <hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/matrset('000000000000000016')

Here MatlGroup is 00108; we will update it to 00107 and verify it through the READ operation.

Click on Use as Request, make the necessary changes to the XML as below, select the HTTP method PUT for updating and click Execute.

XML:

 

<?xml version="1.0" encoding="utf-8"?>
<entry xml:base="http://<hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/"
       xmlns="http://www.w3.org/2005/Atom"
       xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
       xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <content type="application/xml">
    <m:properties>
      <d:BaseUom>EA</d:BaseUom>
      <d:MatlGroup>00107</d:MatlGroup>
      <d:BasicView>X</d:BasicView>
      <d:MatlType>ROH</d:MatlType>
      <d:IndSector>1</d:IndSector>
      <d:Material>000000000000000016</d:Material>
      <d:MatlDesc>Test material gateway</d:MatlDesc>
      <d:LanguIso>EN</d:LanguIso>
      <d:Langu>E</d:Langu>
    </m:properties>
  </content>
</entry>

 

OUTPUT:

Now we will check whether the Material was updated or not by using the READ operation. Select the HTTP method GET and check the output.

DELETE:

Now we will try to DELETE the Material which we have created; for this we use the HTTP method DELETE.

Here, to delete the Material, we will just set the deletion flag to 'X'. First we will read that material, and using that XML as a request we will process the DELETE operation. Here we select the HTTP method DELETE and execute it after modifying the XML.

 

XML:

<?xml version="1.0" encoding="utf-8"?>
<entry xml:base="http://<hostname>:<port>/sap/opu/odata/sap/ZBPS_MATR_DEMO_SRV/"
       xmlns="http://www.w3.org/2005/Atom"
       xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
       xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <content type="application/xml">
    <m:properties>
      <d:MatlGroup>00107</d:MatlGroup>
      <d:Material>000000000000000019</d:Material>
      <d:DelFlag>X</d:DelFlag>
    </m:properties>
  </content>
</entry>

OUTPUT:

With this we have completed all the CRUD operations.

 

SAP (NetWeaver) Gateway 7.3.1 - Obsolete Runtime Artifacts cannot be deleted


I found out (and I'm not the only one) that if the Gateway creates an interface which shows up in SEGW under Runtime Artifacts and you delete it in the Class Builder (right-click via 'Go to ABAP Workbench' from SEGW), it STILL shows up under Runtime Artifacts and you can't get rid of it.

 

blog1.JPG

 

That's not good because it can pile up junk during project development.

 

So here, until this bug is fixed, is a brute-force workaround for all sufferers:

 

The objects are located in database table /IWBEP/I_SBD_GA.

 

blog2.jpg

I deleted the entry (I know, it's not elegant, but it works for now) and SEGW is clean again:

 

blog3.jpg

And the best thing: if you do this in the development system, the change will take effect in the target systems after transport.

 

Nevertheless, SAP, please fix this bug.

Vesna: Spring framework functionality for ABAP OO


Yes, this is not an April Fools' joke, and yes, if you were missing the Spring Framework in ABAP OO, your longing is over:

The Vesna Framework Add-On 100_702 has been released, and it provides a Spring look and feel in the ABAP OO environment, enriched with some SAP NetWeaver-specific features.

Vesna is not a literal port of the Spring Framework but rather a "spiritual" one: it brings the functionality of Spring to the ABAP OO world, although the technology behind it has much more in common with the .NET Castle framework, implemented in an ABAP OO-specific manner.

The first productive use case of Vesna is coolOrange PowerGate, a brilliant, OData-based integration platform for Autodesk and SAP environments. Come and visit us at Autodesk AU in Darmstadt on 23-24.10 to see PowerGate and Vesna in action!

Right now you can watch a webcast showing an example use case for Vesna: one of the SAP NetWeaver Gateway services from the PowerGate stable.

This specific webcast concentrates on application composition with Vesna. However, the current version of the Vesna Framework (100_702) also includes AOP support. And yes, AOP here stands for nothing short of Aspect-Oriented Programming! Quite soon I hope to be able to show you a more involved demo of Vesna AOP as well.

If the webcast has captured your attention, wait no longer: go and grab a copy for yourself from the Arcona Labs S.A. web presence. The Vesna Framework is free for research and non-commercial use, and a full commercial license, including lifetime upgrades, costs as little as a 1000 EUR one-time fee.
