SAP NetWeaver Gateway Developer Center

Multi-deep insert in SAP Gateway (SEGW Approach)


Hi everyone,

 

The example below shows how to pass multiple nested tables as input to an OData/Gateway service using the SEGW approach.

 

We have six tables:


1. Header (Fields: ID, Name, Message_text)

    • Item1 (Fields: ID, Name1, Address)
      • Item1.1 (Fields: ID, Name, State, Country)
      • Item1.2 (Fields: Fname, Mname, LName)
    • Item2 (Fields: ID, Name, City)
      • Item2.1 (Fields: ID, Name, Pincode, Street)


Item1 and Item2 are inline to the Header table.

Item1.1 and Item1.2 are inline to the Item1 table.

Item2.1 is inline to the Item2 table.

 

In a multi-deep insert, we should be able to pass multiple line items in Item1, Item1.1, Item1.2, Item2 and Item2.1 along with the data in the header structure.

 

For example:

Input to the service:

   

oo.png


Output from the service:


Input.png


The value of Message_text in the header structure is changed to 'Test1'.


Steps to achieve the above scenario:

 

Step 1: Create six entity types: Header, Item1, Item2, Item1_1, Item1_2 and Item2_1.

                  

Entitytypes.png      entity_types_2.png

 

Properties are:

 

Header:

header.png

 

Item1:

 

item1.png

 

Item1_1:

 

item1_1.png

 

Item1_2:

 

item1_2.png

 

Item2:

 

item2.png

 

Item2_1:

 

item2_1.png

*************************************************************************************************************************************************************

Step 2: Create entity sets HeaderSet, Item1Set, Item2Set, Item1_1Set, Item1_2Set and Item2_1Set.

 

 

entity_sets.png

*************************************************************************************************************************************************************

Step3: Create Associations between entity types:


1. Header – item1

2. Header – item2

3. Item1--Item1.1

4. Item1--item1.2

5. Item2--Item2.1

Associations.png

      

Navigation properties are created automatically under the entity types Header, Item1 and Item2.

 

navigation_header.png  navigation_item1.png      navigation_item2.png

*************************************************************************************************************************************************************

Step 4: Save and check for errors.

Step 5: Generate the runtime classes and services.

Step 6: Register the service in the Gateway hub.

Step 7: Go to the DPC_EXT class in the ABAP Workbench.

Step 8: Redefine the method /IWBEP/IF_MGW_APPL_SRV_RUNTIME~GET_EXPANDED_ENTITYSET.

This method is redefined to get the request payload of the multi-deep insert.

 

 

Step 9: Inside GET_EXPANDED_ENTITYSET:

 

code1.png

 

ITEM1TOITEM1_1 and ITEM1TOITEM1_2 are the names of the navigation properties for table Item1.

ITEM2TOITEM2_1 is the navigation property for table Item2.

HEADERITEM1 and HEADERITEM2 are the navigation properties for table Header.

 

 

Insert data into the structures.

 

Code2.png

 

Add three entries to the table et_expanded_tech_clauses to tell the framework that there are inline entities under the base entity (the values to be inserted are the navigation properties). Without this code, the GET_EXPANDED_ENTITYSET method would be hit six times, once for each of the entities created.


Insert data into the respective structures and tables (Header, Item1, Item2, Item1.1, Item1.2, Item2.1) and append that data to the final output, as sketched below.


Copy_data_to_ref.png
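Since the implementation itself is only shown in the screenshots above, here is a minimal, hedged sketch of what the GET_EXPANDED_ENTITYSET redefinition typically looks like. The nested types mirror the navigation properties named above; the generated MPC type names (zcl_zgw_createdeep_mpc=>ts_*) and the exact values of the expanded tech clauses are assumptions and must be adapted to your own project.

  " Nested types mirroring the expand hierarchy (names are illustrative).
  TYPES: BEGIN OF ty_item1_deep.
           INCLUDE TYPE zcl_zgw_createdeep_mpc=>ts_item1.
  TYPES:   item1toitem1_1 TYPE STANDARD TABLE OF zcl_zgw_createdeep_mpc=>ts_item1_1 WITH DEFAULT KEY,
           item1toitem1_2 TYPE STANDARD TABLE OF zcl_zgw_createdeep_mpc=>ts_item1_2 WITH DEFAULT KEY,
         END OF ty_item1_deep.

  TYPES: BEGIN OF ty_item2_deep.
           INCLUDE TYPE zcl_zgw_createdeep_mpc=>ts_item2.
  TYPES:   item2toitem2_1 TYPE STANDARD TABLE OF zcl_zgw_createdeep_mpc=>ts_item2_1 WITH DEFAULT KEY,
         END OF ty_item2_deep.

  TYPES: BEGIN OF ty_header_deep.
           INCLUDE TYPE zcl_zgw_createdeep_mpc=>ts_header.
  TYPES:   headeritem1 TYPE STANDARD TABLE OF ty_item1_deep WITH DEFAULT KEY,
           headeritem2 TYPE STANDARD TABLE OF ty_item2_deep WITH DEFAULT KEY,
         END OF ty_header_deep.

  DATA: lt_header_deep TYPE STANDARD TABLE OF ty_header_deep,
        ls_header_deep TYPE ty_header_deep.

  " Fill ls_header_deep and its nested item tables here, then append it.
  APPEND ls_header_deep TO lt_header_deep.

  " The three entries that tell the framework the inline entities are
  " handled here (without them the method is hit once per entity set).
  APPEND 'HEADERITEM1/ITEM1TOITEM1_1' TO et_expanded_tech_clauses.
  APPEND 'HEADERITEM1/ITEM1TOITEM1_2' TO et_expanded_tech_clauses.
  APPEND 'HEADERITEM2/ITEM2TOITEM2_1' TO et_expanded_tech_clauses.

  " Return the nested table to the framework as a data reference.
  copy_data_to_ref( EXPORTING is_data = lt_header_deep
                    CHANGING  cr_data = er_entityset ).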

Save and Activate.

 

Step 10: Form the URL.

 

URL.PNG
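For reference, the request URL in the screenshot follows this pattern (an assumption based on the service name and the navigation properties used in this example):

/sap/opu/odata/sap/ZGW_CREATEDEEP_66883_SRV/headerSet?$expand=HEADERITEM1/ITEM1TOITEM1_1,HEADERITEM1/ITEM1TOITEM1_2,HEADERITEM2/ITEM2TOITEM2_1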

 

HTTP method : GET

 

Execute (F8).

 

******************************************************************************************************************************************************

Step 11: Use the response from the above request to create the request payload of the multi-deep insert.

 

(Remove the <feed> tag below the first line and the closing </feed> tag.)

 

 

 

 

 

Step 12: Once the request payload is generated, redefine the method /IWBEP/IF_MGW_APPL_SRV_RUNTIME~CREATE_DEEP_ENTITY.

 

Inside /IWBEP/IF_MGW_APPL_SRV_RUNTIME~CREATE_DEEP_ENTITY,

 

Declare the output structure.

 

output_structure.png

 

Read the data sent by the user through the request payload into the structure ls_headeritem.

 

ls_headeritem.png

 

Change the message_text of the header structure:

 

ls_headeritem-message_text = 'Test1'.

 

Call copy_data_to_ref and pass ls_headeritem as the structure.
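Putting these steps together, a minimal hedged sketch of the redefinition (assuming the nested type ty_header_deep from the GET_EXPANDED_ENTITYSET sketch above):

  DATA ls_headeritem TYPE ty_header_deep.

  " Read the nested request payload into the deep structure.
  io_data_provider->read_entry_data( IMPORTING es_data = ls_headeritem ).

  " Server-side change: overwrite the message text.
  ls_headeritem-message_text = 'Test1'.

  " Return the (modified) deep structure to the caller.
  copy_data_to_ref( EXPORTING is_data = ls_headeritem
                    CHANGING  cr_data = er_deep_entity ).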

 

Step 13: Form the URL.

 

URL:

 

/sap/opu/odata/sap/ZGW_CREATEDEEP_66883_SRV/headerSet


Method: POST.

 

Execute (F8).

 

This way we can implement such scenarios.

 

 

Thanks,

 

Anjor Wagle.

 

 

 

 






Custom header parameters in Gateway

$
0
0

I started off with Gateway learning as part of my new UI5 project. As you all know, Gateway is a framework to expose rich backend SAP data in the form of the OData protocol. I am not going to give many motherhood statements, as it is well documented elsewhere.

 

Now .. The Problem Statement

Across many scenarios, I came across a common requirement to pass custom HTTP request header parameters.

To give one sample scenario: there is an entity type Exemption. From a business perspective, an Exemption can be 'Requested' or 'Approved' from a UI application, and the 'Request' or 'Approve' action can be done in batch for Exemption records. If we model this as a property (I will call it BusinessAction) in the entity type Exemption, the problem is that the payload has to repeat this property for every record it sends. What's more, from a modelling perspective it does look silly to have BusinessAction in that entity type.

 

Then I came across a blog, http://scn.sap.com/community/developer-center/netweaver-gateway/blog/2013/06/21/odata-custom-parameters--gateway-can-have-them-too, from Ron Sargeant. This involved changes to both the GW hub system layer and the backend BEP layer. In our case we did not have developer authorisation in the GW hub system; the GW hub system is just used for registration of the backend service.

 

So what's the option we have?

images_confused.jpg

 

Welcome Mr Trial & Error ...

I just tried a sample scenario wherein I send a sample HTTP request header parameter and see how it reaches our broker GW and then ultimately Uncle Backend's home...

 

I tried to send a custom parameter via a request header for a sample entity type. Then I put a breakpoint in the GET_ENTITYSET method of the corresponding entity in our DPC class. The values appeared as below.

 

Custom_parameter.png

This part of debugging is called 'Happy Debugging': I was happy to see the values appear in the importing parameter io_tech_request_context.


When I checked in the debugger, io_tech_request_context is of type /IWBEP/CL_MGW_REQUEST with the protected attribute MR_REQUEST.


Tech_req_context.jpg

The attribute MR_REQUEST has a deep structure TECHNICAL_REQUEST, which has attributes like those below, including the attribute REQUEST_HEADER.

tech_req_header.jpg

Going inside, we find the request header parameter that we set on the journey...

Custom_parameter_debugger.png

Now I thought it's just a matter of reading the values from io_tech_request_context->mr_request->request_header... (so simple).

 

This part of debugging is called 'Not so happy Debugging'

I realised the attribute MR_REQUEST is a protected attribute and cannot be accessed by simple assignment to a local variable.

 

So what's the workaround?

 

I made a subclass of the standard class /IWBEP/CL_MGW_REQUEST.

ZCLGW_REQUEST.jpg

I wrote a method GET_REQUEST_HEADER_VALUE, which takes the request context object and the header parameter name and returns the value.

Method_signatures.jpg
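For reference, a hedged sketch of the subclass definition shown in the screenshot (the names follow the usage code further below; the exact typing is an assumption):

CLASS zcl_gw_mgw_request DEFINITION INHERITING FROM /iwbep/cl_mgw_request CREATE PUBLIC.
  PUBLIC SECTION.
    METHODS get_request_header_value
      IMPORTING
        io_tech_request_context TYPE REF TO /iwbep/cl_mgw_request
        iv_request_parameter    TYPE string
      RETURNING
        VALUE(rv_request_value) TYPE string.
ENDCLASS.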

The code inside the method for reading the header parameter is as below:

 

DATA: lo_request         TYPE REF TO /iwbep/if_mgw_core_srv_runtime=>ty_s_mgw_request_context,
      lt_request_headers TYPE tihttpnvp,
      ls_request_header  LIKE LINE OF lt_request_headers.

lo_request = io_tech_request_context->mr_request.
lt_request_headers = lo_request->technical_request-request_header.

READ TABLE lt_request_headers INTO ls_request_header WITH KEY name = iv_request_parameter.
IF sy-subrc EQ 0.
  rv_request_value = ls_request_header-value.
ENDIF.

 

Reading the value

 

Now I just need to use the utility method GET_REQUEST_HEADER_VALUE of the subclass to get the custom parameter value.

 

Use the following code in your CRUDQ method to read the request header

 

DATA: lo_tech_request         TYPE REF TO /iwbep/cl_mgw_request,
      lo_tech_request_sub     TYPE REF TO zcl_gw_mgw_request,
      lt_headers              TYPE tihttpnvp,
      lv_request_header       TYPE string,
      lv_request_header_value TYPE string.

** Cast and assign the importing parameter to a local variable
lo_tech_request ?= io_tech_request_context.

** Instantiate the subclass with some dummy header parameters
CREATE OBJECT lo_tech_request_sub
  EXPORTING
    it_headers = lt_headers.

lv_request_header_value = lo_tech_request_sub->get_request_header_value(
                            EXPORTING
                              io_tech_request_context = lo_tech_request
                              iv_request_parameter    = 'customparameter' ).

 

Testing the code - the final showdown...

 

Now, in the GW client on the GW hub system, I run the below URL with custom HTTP request header parameters.

Final_test_gw_client.jpg

Now in the debugger of GET_ENTITYSET...

 

Final_result.jpg

File Upload/Download in CRM WebUI Using NetWeaver Gateway/OData Services


1). Step-by-step guide to upload a file attachment in CRM WebUI using NetWeaver Gateway.


Create the project in SEGW Transaction Code and the Entity Type:

In the Entity Type Properties select the check box: Media

attach1.png

And the properties of Entity Type are:

attach2.png

And then map the RFC function module for the Create Operation in the Entity Set.

attach3.png

And do the mapping for Get Entity (Read) Operation in the Entity Set.

attach4.png

Then Redefine the DEFINE method in the *MPC_EXT class and add the below logic:

 

METHOD define.
  super->define( ).

  DATA: lo_entity   TYPE REF TO /iwbep/if_mgw_odata_entity_typ,
        lo_property TYPE REF TO /iwbep/if_mgw_odata_property.

  lo_entity = model->get_entity_type( iv_entity_name = 'TerritoryFileAttachment' ).

  IF lo_entity IS BOUND.
    lo_property = lo_entity->get_property( iv_property_name = 'MIME_TYPE' ).
    IF lo_property IS BOUND.
      lo_property->set_as_content_type( ).
    ENDIF.
  ENDIF.

ENDMETHOD.


Then Redefine the CREATE_STREAM Method (/IWBEP/IF_MGW_APPL_SRV_RUNTIME~CREATE_STREAM) in the *DPC_EXT class and implement the below logic to upload the file attachment into the CRM WebUI for a given Territory Plan.

 

All input parameters/values have to be passed in the SLUG header from the UI side (if there are multiple input values, they are concatenated with a delimiter into the single SLUG value).
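For example (illustrative values, matching the key fields used later in this article), the client could send:

slug: CRM-XN14-CRM-1292/roadmap.docx/Roadmap/Territory roadmap document

The CREATE_STREAM implementation below splits this value at '/' into TP_ID, FILENAME, NAME and DESCRIPTION.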

 

METHOD /iwbep/if_mgw_appl_srv_runtime~create_stream.

  DATA: ls_file_attach               TYPE        ztp_s_file_attachment,
        lv_tp_guid                   TYPE        crm_mktpl_ib_mpl_guid,
        ls_key                       TYPE        /iwbep/s_mgw_tech_pair,
        lt_keys                      TYPE        /iwbep/t_mgw_tech_pairs,
        lv_entityset_name            TYPE        string,
        lv_entity_name               TYPE        string,
        lo_tech_read_request_context TYPE REF TO /iwbep/cl_sb_gen_read_aftr_crt,
        ls_entity                    TYPE REF TO data,
        ls_string                    TYPE        string.

  DATA: ls_bo                  TYPE        sibflporb,
        lt_properties          TYPE        sdokproptys,
        ls_properties          TYPE        sdokpropty,
        lt_file_access         TYPE        sdokfilacis,
        ls_file_access         TYPE        sdokfilaci,
        lt_file_content_binary TYPE        sdokcntbins,
        ls_loio                TYPE        skwf_io,
        ls_phio                TYPE        skwf_io,
        ls_error               TYPE        skwf_error,
        lv_file_size           TYPE        i,
        lt_messages            TYPE        zif_zdmtp_service=>bapiret2_t,
        ls_messages            TYPE        bapiret2,
        lo_dp_facade           TYPE REF TO /iwbep/if_mgw_dp_facade,
        lv_destination         TYPE        rfcdest,
        lr_dmtp_service        TYPE REF TO zcl_dmtp_service,
        lv_tp_id               TYPE        crm_mktpl_campaignid.

  FIELD-SYMBOLS: <ls_data> TYPE any.

  CLEAR: ls_file_attach, lv_tp_guid, ls_bo, lt_properties, ls_properties,
         lt_file_access, ls_file_access, lt_file_content_binary, ls_loio,
         ls_phio, ls_error, lv_file_size.

*** The IV_SLUG parameter is passed from the front-end side
  SPLIT iv_slug AT '/' INTO ls_file_attach-tp_id
                            ls_file_attach-filename
                            ls_file_attach-name
                            ls_file_attach-description.

**** File type (MIME type)
  ls_file_attach-mime_type  = is_media_resource-mime_type.

**** File content as XSTRING
  ls_file_attach-file_value = is_media_resource-value.

**** Convert the Territory Plan ID into a GUID
  CALL FUNCTION 'CONVERSION_EXIT_CGPLP_INPUT'
    EXPORTING
      input  = ls_file_attach-tp_id
    IMPORTING
      output = lv_tp_guid.

**** Build the attachment business object
  ls_bo-catid  = 'BO'.
  ls_bo-typeid = 'BUS2010010'.
  ls_bo-instid = lv_tp_guid.

**** Build the attachment properties
  ls_properties-name  = 'KW_RELATIVE_URL'. "NAME
  ls_properties-value = ls_file_attach-name.
  APPEND ls_properties TO lt_properties.
  CLEAR ls_properties.

  ls_properties-name  = 'DESCRIPTION'. "DESCRIPTION
  ls_properties-value = ls_file_attach-description.
  APPEND ls_properties TO lt_properties.
  CLEAR ls_properties.

  ls_properties-name  = 'MIMETYPE'. "MIME TYPE
  ls_properties-value = ls_file_attach-mime_type.
  APPEND ls_properties TO lt_properties.
  CLEAR ls_properties.

**** Convert the attachment file data from XSTRING to binary
  CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
    EXPORTING
      buffer        = ls_file_attach-file_value
    IMPORTING
      output_length = lv_file_size
    TABLES
      binary_tab    = lt_file_content_binary.

**** Build the file access information
  ls_file_access-file_size  = lv_file_size.
  ls_file_access-binary_flg = abap_true.
  ls_file_access-file_name  = ls_file_attach-filename.
  ls_file_access-mimetype   = ls_file_attach-mime_type.
  APPEND ls_file_access TO lt_file_access.
  CLEAR ls_file_access.

**** Upload the attachment for the Territory Plan in CRM WebUI
  CALL METHOD cl_crm_documents=>create_with_table
    EXPORTING
      business_object     = ls_bo
      properties          = lt_properties
      file_access_info    = lt_file_access
      file_content_binary = lt_file_content_binary
    IMPORTING
      loio                = ls_loio
      phio                = ls_phio
      error               = ls_error.

  IF ls_error IS INITIAL.
    ls_file_attach-file_loio_guid = ls_loio-objid.
  ELSE.
    ls_messages-id         = ls_error-id.
    ls_messages-number     = ls_error-no.
    ls_messages-type       = ls_error-type.
    ls_messages-message_v1 = ls_error-v1.
    ls_messages-message_v2 = ls_error-v2.
    ls_messages-message_v3 = ls_error-v3.
    ls_messages-message_v4 = ls_error-v4.
    APPEND ls_messages TO lt_messages.

    me->/iwbep/if_sb_dpc_comm_services~rfc_save_log(
      EXPORTING
        iv_entity_type = iv_entity_name
        it_return      = lt_messages
        it_key_tab     = it_key_tab ).

**** Call RFC commit work
    me->/iwbep/if_sb_dpc_comm_services~commit_work(
      EXPORTING
        iv_rfc_dest = lv_destination ).

    RETURN.
  ENDIF.

*--------------------------------------------------------------------*
* Read after create
*--------------------------------------------------------------------*
  CREATE OBJECT lo_tech_read_request_context.

* Create the key table for the read operation
  ls_key-name  = 'TP_ID'.
  ls_key-value = ls_file_attach-tp_id.
  APPEND ls_key TO lt_keys.

  ls_key-name  = 'FILENAME'.
  ls_key-value = ls_file_attach-filename.
  APPEND ls_key TO lt_keys.

  ls_key-name  = 'IV_OBJECT'.
  ls_key-value = 'FILEATTACH'.
  APPEND ls_key TO lt_keys.

  ls_key-name  = 'FILE_LOIO_GUID'.
  ls_key-value = ls_file_attach-file_loio_guid.
  APPEND ls_key TO lt_keys.

**** Set the key table and the entity set name into the request context object
  lo_tech_read_request_context->set_keys( IMPORTING et_keys = lt_keys ).

  lv_entityset_name = io_tech_request_context->get_entity_set_name( ).

  lo_tech_read_request_context->set_entityset_name( IMPORTING ev_entityset_name = lv_entityset_name ).

**** Call read after create
  /iwbep/if_mgw_appl_srv_runtime~get_entity(
    EXPORTING
      iv_entity_name          = iv_entity_name
      iv_entity_set_name      = iv_entity_set_name
      iv_source_name          = iv_source_name
      it_key_tab              = it_key_tab
      io_tech_request_context = lo_tech_read_request_context
      it_navigation_path      = it_navigation_path
    IMPORTING
      er_entity               = ls_entity ).

**** Send the read response to the caller interface
  ASSIGN ls_entity->* TO <ls_data>.

  IF <ls_data> IS ASSIGNED.
    copy_data_to_ref(
      EXPORTING
        is_data = <ls_data>
      CHANGING
        cr_data = er_entity ).
  ENDIF.

ENDMETHOD.


Once the CREATE_STREAM method redefinition is done, we can test the service using a REST client to upload the file attachment into CRM WebUI for a Territory Plan.

 

Please note: the CRM WebUI standard functionality allows a maximum file size of 30 MB for uploads.

 

Test the service using the REST client.

 

First, get the CSRF token value by calling the service below.

 

attach5.png

Then in the response we will get the CSRF Token Value.

 

attach6.png

Then supply the CSRF token value and the SLUG parameter in the headers and choose the file to upload via the OData service. Once we click Send, the file is uploaded into CRM WebUI for the given Territory Plan.

 

attach7.png

 

2). Step-by-step guide to read/download the file attachment from CRM WebUI using NetWeaver Gateway.


Continuing from part 1, redefine the GET_STREAM method (/IWBEP/IF_MGW_APPL_SRV_RUNTIME~GET_STREAM) in the *DPC_EXT class and implement the below logic to read/download the file attachment from CRM WebUI for a given Territory Plan.

 

METHOD /iwbep/if_mgw_appl_srv_runtime~get_stream.

  DATA: ls_key_tab         TYPE LINE OF /iwbep/t_mgw_name_value_pair,
        ls_stream          TYPE        ty_s_media_resource,
        is_file_attachment TYPE        ztp_s_file_attachment,
        es_file_attachment TYPE        ztp_s_file_attachment,
        lv_media_value     TYPE        xstringval,
        lv_mime_type       TYPE        string,
        lo_data            TYPE REF TO zcl_dmtp_service,
        ls_header          TYPE        ihttpnvp.

**** Read the key field values
  CLEAR: is_file_attachment, ls_key_tab.
  READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'TP_ID'.
  IF sy-subrc = 0.
    is_file_attachment-tp_id = ls_key_tab-value.
  ENDIF.

  CLEAR ls_key_tab.
  READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'FILENAME'.
  IF sy-subrc = 0.
    is_file_attachment-filename = ls_key_tab-value.
  ENDIF.

  CLEAR ls_key_tab.
  READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'IV_OBJECT'.
  IF sy-subrc = 0.
    is_file_attachment-iv_object = ls_key_tab-value.
  ENDIF.

  CLEAR ls_key_tab.
  READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'FILE_LOIO_GUID'.
  IF sy-subrc = 0.
    is_file_attachment-file_loio_guid = ls_key_tab-value.
  ENDIF.

**** Read the file attachment for the Territory Plan
  DATA: ls_loio                TYPE skwf_io,
        lt_loio                TYPE skwf_ios,
        lt_properties_result   TYPE crm_kw_propst,
        ls_properties_result   TYPE crm_kw_props,
        ls_error               TYPE skwf_error,
        lt_file_content_ascii  TYPE sdokcntascs,
        lt_file_content_binary TYPE sdokcntbins,
        lt_file_access         TYPE sdokfilacis,
        ls_file_access         TYPE sdokfilaci,
        iv_length              TYPE i,
        ls_bo                  TYPE sibflporb,
        lv_tp_guid             TYPE crm_mktpl_ib_mpl_guid,
        ls_doc_property        TYPE sdokproptl.

  CLEAR: ls_loio, lt_file_access, lt_file_content_ascii, lt_file_content_binary,
         ls_error, iv_length, ls_bo, lv_tp_guid, lt_properties_result, lt_loio,
         ls_properties_result, ls_doc_property.

**** Convert the Territory Plan ID into a GUID
  CALL FUNCTION 'CONVERSION_EXIT_CGPLP_INPUT'
    EXPORTING
      input  = is_file_attachment-tp_id
    IMPORTING
      output = lv_tp_guid.

**** Build the attachment business object
  ls_bo-catid  = 'BO'.
  ls_bo-typeid = 'BUS2010010'.
  ls_bo-instid = lv_tp_guid.

**** Get the attachment properties information
  CALL METHOD cl_crm_documents=>get_info
    EXPORTING
      business_object       = ls_bo
    IMPORTING
      ios_properties_result = lt_properties_result
      loios                 = lt_loio.

**** Read the file attachment LOIO GUID, class and object type
  READ TABLE lt_loio INTO ls_loio WITH KEY objid = is_file_attachment-file_loio_guid.
  IF sy-subrc EQ 0.
****** Get the file attachment data in binary
    CALL METHOD cl_crm_documents=>get_with_table
      EXPORTING
        loio                = ls_loio
      IMPORTING
        file_access_info    = lt_file_access
        file_content_ascii  = lt_file_content_ascii
        file_content_binary = lt_file_content_binary
        error               = ls_error.

    IF ls_error IS INITIAL.
      es_file_attachment-iv_object      = is_file_attachment-iv_object.
      es_file_attachment-tp_id          = is_file_attachment-tp_id.
      es_file_attachment-filename       = is_file_attachment-filename.
      es_file_attachment-file_loio_guid = is_file_attachment-file_loio_guid.

******** Read the file MIME type and file size
      READ TABLE lt_file_access INTO ls_file_access INDEX 1.
      IF sy-subrc EQ 0.
        es_file_attachment-mime_type = ls_file_access-mimetype.
        iv_length                    = ls_file_access-file_size.
      ENDIF.

******** Read the attachment NAME and DESCRIPTION values
      READ TABLE lt_properties_result INTO ls_properties_result
           WITH KEY objtype = ls_loio-objtype
                    class   = ls_loio-class
                    objid   = ls_loio-objid.
      IF sy-subrc EQ 0.

********** Read NAME
        CLEAR ls_doc_property.
        READ TABLE ls_properties_result-properties INTO ls_doc_property WITH KEY name = 'KW_RELATIVE_URL'.
        IF sy-subrc EQ 0.
          es_file_attachment-name = ls_doc_property-value.
        ENDIF.

********** Read DESCRIPTION
        CLEAR ls_doc_property.
        READ TABLE ls_properties_result-properties INTO ls_doc_property WITH KEY name = 'DESCRIPTION'.
        IF sy-subrc EQ 0.
          es_file_attachment-description = ls_doc_property-value.
        ENDIF.

      ENDIF.

      IF lt_file_content_binary IS INITIAL.
****** If the file attachment format is .TXT, convert ASCII to binary
        CALL FUNCTION 'SCMS_TEXT_TO_BINARY'
          IMPORTING
            output_length = iv_length
          TABLES
            text_tab      = lt_file_content_ascii
            binary_tab    = lt_file_content_binary
          EXCEPTIONS
            failed        = 1
            OTHERS        = 2.
        IF sy-subrc <> 0.
*         Implement suitable error handling here
        ENDIF.
      ENDIF.

****** Convert the binary data to XSTRING
      CALL FUNCTION 'SCMS_BINARY_TO_XSTRING'
        EXPORTING
          input_length = iv_length
        IMPORTING
          buffer       = es_file_attachment-file_value
        TABLES
          binary_tab   = lt_file_content_binary
        EXCEPTIONS
          failed       = 1
          OTHERS       = 2.
      IF sy-subrc <> 0.
*       Implement suitable error handling here
      ENDIF.

    ENDIF.
  ENDIF.

  IF es_file_attachment IS NOT INITIAL.
****** Move the file type (MIME type) value to the final work area
    ls_stream-mime_type = es_file_attachment-mime_type.

****** Move the file content (XSTRING) to the final work area
    ls_stream-value = es_file_attachment-file_value.

****** Fill the header information so the actual file name is shown when downloading the attachment
    ls_header-name  = 'Content-Disposition'.
    ls_header-value = 'inline; filename="'.
    CONCATENATE ls_header-value es_file_attachment-filename '"' INTO ls_header-value.

    set_header( is_header = ls_header ).
  ENDIF.

  CALL METHOD me->copy_data_to_ref
    EXPORTING
      is_data = ls_stream
    CHANGING
      cr_data = er_stream.

ENDMETHOD.


Once the GET_STREAM method redefinition is done, we can test the service using a REST client to read/download the file attachment from CRM WebUI for a Territory Plan using the OData service.

 

To test reading/downloading the corresponding file from CRM WebUI, call the service in the Chrome browser with all key field parameter values and $value:

 

https://<Host:Port>/sap/opu/odata/sap/ZSN_DM_TP_SRV/TerritoryFileAttachmentSet(TP_ID='CRM-XN14-CRM-1292',FILENAME='roadmap.docx',IV_OBJECT='',FILE_LOIO_GUID='005056A501651ED48FB13A5BB66964C9')/$value

 

attach8.png

 

 


Gateway Service: Optimizing Performance with SQL Paging


I did some performance analysis of Gateway services. My focus in this blog is on GET_ENTITYSET methods where a high number of database entries is selected, for instance lots of sales orders, and where for performance reasons it is not possible to load all data in the initial request to the client/browser; instead, further data is loaded when the user scrolls down or changes filters. Because the service is stateless, I found a lot of expensive SQL statements in the trace.

As of Gateway 2.0 there is a 'limited stateful' mode called soft-state; see: http://scn.sap.com/docs/DOC-58760

But stateful operation brings other issues (resources/memory), and REST was designed stateless (see SAP Note 1986626).

 

Looking at some standard Fiori Gateway services, such as sales order, paging was done like this:

 

CL_LORD_MY_QUOTATION_DPC_EXT

QUOTATIONSET_GET_ENTITYSET

 

METHOD quotationset_get_entityset.
...
" Initialize paging; if top is not provided, 0 is passed.
" No need to check skip/top: they are expected to be numbers, and the
" OData Gateway checks and provides us numeric data.
IF is_paging-top IS NOT INITIAL.
  lv_max = is_paging-skip + is_paging-top.
ENDIF.

...

SELECT head~vbeln head~auart head~kunnr AS kunag soldto~name1 AS kunag_t
       head~angdt head~bnddt head~erdat AS erdat_r head~netwr AS netwr_r
       head~waerk status~gbstk sdbusiness~bstkd head~vkorg head~vtweg
       head~spart head~vdatu status~abstk status~rfstk status~uvals status~uvall
  UP TO lv_max ROWS
  INTO CORRESPONDING FIELDS OF TABLE lt_docs
  FROM (lv_from_clause)
  WHERE head~vbeln IN lt_rg_vbeln

...

METHOD truncate_table.

  IF iv_skip IS NOT INITIAL.
    DELETE ct_table TO iv_skip.
  ENDIF.

  IF iv_top IS NOT INITIAL.
    DELETE ct_table FROM iv_top + 1.
  ENDIF.

ENDMETHOD.

So the good thing is that only the first n (e.g. 50) entries are selected, but after several scroll requests this will be 1,000 and more. OK, at least the DB query buffer is filled on the second SELECT statement, but it is still not very DB-optimized.

 

Regarding DB optimization: I looked at some HANA Fiori apps (XS server) and they use a SELECT ... LIMIT 50 OFFSET n statement, so they can select exactly the frame of records required. Very nice!

 

This would be nice for ABAP as well. I hoped that the LIMIT/OFFSET feature would be available with the new Open SQL expressions, but it did not (yet) work.
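(For completeness, a hedged sketch: on newer releases, ABAP 7.51 and higher, Open SQL does support OFFSET, together with a mandatory ORDER BY. Reusing the paging values from above, it could look like this; on the release I tested, this was not yet available:

SELECT vbeln, auart, kunnr
  FROM vbak
  WHERE vbeln IN @lt_rg_vbeln
  ORDER BY vbeln
  INTO CORRESPONDING FIELDS OF TABLE @lt_docs
  UP TO @is_paging-top ROWS
  OFFSET @is_paging-skip.
)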

 

Of course I can use native SQL to get the feature, using the CL_SQL_STATEMENT class (this also works on non-HANA databases):

 

**** Create the SQL connection; pass the DBCON ID to state which database connection will be used
  DATA lr_sql TYPE REF TO cl_sql_statement.

  CREATE OBJECT lr_sql
    EXPORTING
      con_ref = cl_sql_connection=>get_connection( 'AB1' ).

**** Execute a query, passing in the query string and receiving a result set object
  DATA lr_result TYPE REF TO cl_sql_result_set.

  lr_result = lr_sql->execute_query(
    |SELECT * FROM SFLIGHT WHERE MANDT = '{ sy-mandt }' AND CARRID = 'LH' LIMIT 200 OFFSET 0| ).

**** All data (parameters in, result sets back) is passed via data references
  DATA lt_sflight TYPE TABLE OF sflight.
  DATA lr_sflight TYPE REF TO data.

  GET REFERENCE OF lt_sflight INTO lr_sflight.

**** Get the result data set back into our ABAP internal table
  lr_result->set_param_table( lr_sflight ).
  lr_result->next_package( ).
  lr_result->close( ).



Then I remembered the new ALV with Integrated Data Access (IDA), and here I see a nice feature (also on non-HANA databases):

 

data: is_restrictions    type if_sadl_query_engine_types=>ty_restrictions,
      is_aggregation     type if_sadl_query_engine_types=>ty_aggregation,
      it_sort_elements   type if_sadl_query_engine_types=>tt_sort_elements,
      is_requested       type if_sadl_query_engine_types=>ty_requested,
      is_paging          type if_sadl_query_engine_types=>ty_paging,
      is_parameters      type if_sadl_query_engine_types=>ty_parameters,
      ev_number_hits     type i,
      ev_number_all_hits type i.

data: row_count type i.
data: it_ranges type if_salv_service_types=>yt_named_ranges.
data: wa_range like line of it_ranges.

data: lt_sbook type table of sbook.
data: ms_view_metadata type if_sadl_view_db=>ty_view_metadata.

cl_salv_ida_services=>create_entity_and_abqi(
  exporting iv_entity_id   = conv #( 'SBOOK' )
            iv_entity_type = cl_sadl_entity_factory=>co_type-ddic_table_view
  importing eo_entity      = data(lo_entity)
            eo_abqi        = data(lo_abqi) ).

data(lo_ida_structdescr) = cl_salv_ida_structdescr=>create_for_sadl_entity(
  io_entity = lo_entity ).
* io_calc_field_handler = io_calc_field_handler ).

data(lo_query_engine) = new cl_salv_ida_query_engine( io_structdescr_prov = lo_ida_structdescr
                                                      io_sadl_engine      = lo_abqi ).

data(lo_idas) = cl_salv_ida_services=>create( io_structdescr_prov = lo_ida_structdescr
                                              io_query_engine     = lo_query_engine ).

is_paging-start_row    = 10.
is_paging-maximum_rows = 20.

refresh it_ranges.
wa_range-name   = 'CARRID'.
wa_range-option = 'EQ'.
wa_range-sign   = 'I'.
wa_range-low    = 'LH'.
append wa_range to it_ranges.

lo_idas->get_query_engine( )->set_selection_range_tab( it_ranges = it_ranges ).

lo_abqi->select(
  exporting "is_text_search = ls_text_search
            "is_aggregation = value #( count_alias = l_count_alias )
            is_requested = value #( "fill_number_all_hits = abap_false
                                    "elements = t_group_by_fields
                                    elements  = value #( ( `CARRID` ) ( `FLDATE` ) )
                                    fill_data = abap_true )
            "is_parameters = ms_parameters
            is_paging    = is_paging
  importing "ev_number_all_hits = row_count
            et_data_rows = lt_sbook ).




Finally, under the hood the IDA does nothing other than call CL_SQL_STATEMENT with SELECT ... LIMIT n OFFSET x.


I hope you can use this too :-)


ODATA GATEWAY SINGLE SIGN-ON


                                                                               

Below are the screenshots for enabling single sign-on for a Gateway service.

Once the application is developed in SEGW, use transaction /IWFND/MAINT_SERVICE to register the service.

 

first.png

 

 

Click the “Add Service” button to add/register the service. The screen below appears once you click the button.

 

second.png

 

Fill all the required fields, including the service class name, and press Enter. The service name will appear in the ‘Select Back-end Services’ section. Click on the service name; a popup will appear as in the screenshot below.

 

third.png

As per the screenshot above, the last row is ‘OAuth enablement’, which is unchecked by default. Click the checkbox to make the service available for single sign-on.

These are all the steps needed to enable single sign-on for an OData service.

OData analytics in Virtual Reality


With Virtual Reality building up steam (Oculus, HoloLens, Samsung Gear, Crystal Cove, Morpheus) it's becoming easier and easier to get tempted into experimenting to see if we can use it for business applications.

 

The idea here is to see how difficult it would be to consume SAP data in virtual reality and render it into objects, so we can analyze the data in more dimensions than the flat ones we typically leverage on a screen or paper.

 

The objective of my little experiment is to generate a bubble chart in virtual reality (just to see if we can do it).


We should: 

  • Deliver an OData service describing business partners by a couple of dimensions
  • Consume the OData service in a virtual reality application.
    • To deliver the application I'm using  Unity (there's a free version!)
    • To see the data in virtual reality,I'm using  Oculus (buy one, it's awesome!)
  • Instantiate our data by interpreting the service & generating spheres
    • Their size would represent sales volume
    • Their color would represent risk
  • Structure the data to make it interpretable
    • By risk
    • By potential

 

The OData service

 

If you frequent forums like these, you can probably figure out how to do this yourself, so I'm not going to describe it in detail. Suffice to say I've got a service set up that provides business partner data in the following form:

 

Service.png

 

A nice OData service that delivers thousands of business partners, their sales volume, risk (low/medium/high), potential (low/medium/high) and location (haven't figured out what to do with this yet). Any SAP Netweaver system can deliver this now that we've got Gateway. You could host this on a BW system, or as I have done, straight on the operational system (in my case: SAP CRM).

 

VR environment

 

This is where game development merges with business application development. For the uninitiated, this might be a little tricky, but honestly it is a lot easier than you might expect (IT always is, isn't it). We need:

  • An environment to walk around in and check out our data
  • The person walking around and doing the checking out
  • Some light so he can actually see something
  • Some reference points, so we don't get lost

 

Translated in Unity, that means:

  • We need a plane in a sphere with inverted normals (so we can see it)
  • A first person. This guy does the walking and the looking around, which is available in standard Unity. There's also an API to integrate with Oculus, which is just another "first person" object but with 2 camera viewpoints instead of one.
  • A directional light in the sphere (our sun basically)
  • For our reference point, I've got a pool set up. That's right, a pool! (the water is just mesmerizing in VR!)

 

It looks like this:

 

World.png

 

A guy on a plane in a sphere with a light staring at a pool: great going so far.

 

When plugging in the API from Oculus (put on the Oculus HMD and you can start feeling dizzy already!), it looks like this:

 

WorldVR.png

 

A guy on a plane in a sphere with a light staring at a pool with goggles on: awesome!

 

Now to get some data in it


This is where Unity scripting comes into play. Unity offers a choice here. I've gone with C# (easier interaction with external libraries).

 

We need 2 scripts:

  • One to capture the user event, load and instantiate our data.
  • One associated with the objects themselves to provide them with color, size and location (and to destroy them of course).

 

First script

The first script is associated with one of the objects in the environment (the pool!) or the person itself. In the "Update" routine of Unity (called every couple of milliseconds), we detect user interaction. In this case, when the user taps the "h" button:

 

// Instantiate based on external data 
if (Input.GetKeyDown ("h")) {
 StartCoroutine ("LoadOData"); 
}

That starts a coroutine (separate thread basically) to load the data:

 

IEnumerator LoadOData() {
    // Load JSON data from a URL
    string url = "file://c:/test/Data";
    WWW BPdata = new WWW(url);
    yield return BPdata;
    if (BPdata.error == null)
    {
        // Successfully loaded the JSON string
        Debug.Log("Loaded following JSON string" + BPdata.text);
        // Process the data in the JSON file
        ProcessData(BPdata.text);
    }
    else
    {
        Debug.Log("ERROR: " + BPdata.error);
    }
}

This gets us our data. I'm loading it from a local file (still got some CORS issues to figure out with the OData service), which is just a dump of the service data in JSON format. The code loads that data in a coroutine and pushes it as a string in a processor. The processor looks like this:

 

private void ProcessData(String JsonString)
{
    JsonData BpString = JsonMapper.ToObject(JsonString);
    Debug.Log (BpString["d"]["results"].Count);

    // Determine the minimum and maximum sales volume over all partners
    float minVolume = 0;
    float maxVolume = 0;
    for (int i = 0; i < BpString["d"]["results"].Count; i++)
    {
        if (minVolume == 0)
        { minVolume = float.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString()); }
        if (int.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString()) < minVolume)
        { minVolume = float.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString()); }
        if (int.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString()) > maxVolume)
        { maxVolume = float.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString()); }
    }

    // Instantiate one sphere per business partner at a random position
    for (int i = 0; i < BpString["d"]["results"].Count; i++)
    {
        Vector3 pos = new Vector3(UnityEngine.Random.Range(-6.00F, 6.00F), 0.5f, UnityEngine.Random.Range(-6.00F, 6.00F));
        GameObject ball = Instantiate(myInstBall, pos, Quaternion.identity) as GameObject;
        ball.name = "Ball" + i;
        MyScript = ball.GetComponent<BallAttributes_C>();
        MyScript.partnerID = int.Parse(BpString["d"]["results"][i]["Partner"].ToString());
        MyScript.salesVolume = int.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString());
        MyScript.salesVolumeC = (float.Parse(BpString["d"]["results"][i]["Salesvolume"].ToString()) / maxVolume);
        MyScript.risk = BpString["d"]["results"][i]["Risk"].ToString();
    }
}

At the start of the routine, I'm leveraging a mapper from an external library (LitJSON). It allows me to flexibly address the objects.

 

Before instantiating, I'm determining the minimum and maximum sales volume of all business partners. There's no knowing which figures to expect, so I'm rendering each sphere's size relative to that minimum and maximum, which is the same thing MS Excel does when rendering a graph, when you think of it.

 

I'm then looping over all objects (business partners) and instantiating a prefab which I've created before. The prefab is no more than a sphere that I've defined as a prefab and which I've assigned to the variable "myInstBall". I'm giving the sphere a random XYZ coordinate in a range between -6.00 and 6.00 (which just defines an area: the pool). I'm then retrieving a script (our second one) that I've assigned to the prefab and which is instantiated along with the spheres. The script contains the variables of the business partners that (the value of which) I'm passing to it.

 

Easy enough. This gives us our data in the form of spheres, floating around in our virtual space:

 

Graph.png

 

3D Object instantiation in a virtual room based on external JSON data from an OData service! Hurray!

 

Second script

 

As you'll notice, the spheres have a color and a size. I'm using the second, object-dependent script to change the appearance of the spheres based on the values of the variables I've passed along during instantiation.

 

The color is changed as follows:

 

switch (risk) {
    case "High":
        gameObject.renderer.material.color = Color.red;
        break;
    case "Medium":
        gameObject.renderer.material.color = Color.gray;
        break;
    case "Low":
        gameObject.renderer.material.color = Color.green;
        break;
}

The size like this:

 

if (salesVolumeC != 0) {
    gameObject.transform.localScale += Vector3.one * salesVolumeC;
}

That gives us a 3D bubble chart, representing sales volume (as the size of the sphere) and risk (as the color). Put the HMD on and you can (literally) step into your data. Nice as this is, we should take it a little further.

 

Structuring the data

 

Having our spheres confronts us with the reason why 2D representation is so useful. Interpretation of what we see becomes trickier in 3D. So let's see if we can structure our data a little bit better:

 

order-balls.png

 

What I've done here is to structure the data in 9 sections representing the cross-sections of risk (Low/Medium/High) and potential (Low/Medium/High). I'm reusing the risk dimension that I've used to color the spheres to illustrate the point.

 

This is what it looks like if bars are used instead of spheres:

 

Order-bars.png

 

 

I do admit that it took me a while to get this done, but once you figure out all the components, it is easier than you think. To me at least, it illustrates that we should not be bound to our flat UIs to build business applications, and it opens up the door to a range of possibilities (imagine combining it with the speed of HANA), both in business application design and in data analytics. Thinking it through, it allows us to marry up game and business application development in a strange and lovingly weird cross-breed that opens up perspectives on all sides.

 

Business cases should be found of course, no discussion there, but who wants to be bound by those ;-)

 

As with all experiments: all feedback is very much welcome. I've started a ghost blog on the topic (where I'm providing more details), which can be found here: https://dataanalyticsinvr.ghost.io/

SAP NetWeaver Gateway violates IETF RFC 3986 => How to fix parsing of correctly encoded OData system query option '!deltatoken'


What happened?


Today I tried to make a request to one of my SAP NetWeaver Gateway services that looked like this:

 

http://<gateway-url>:<gateway-port>/sap/opu/odata/<my-service>/<my-entityset>?%21deltatoken=%27token%27

 

But when I saw the response I wondered why the delta mechanism of my service wasn't working. So I started to debug the request and I noticed that in the method `/iwbep/if_mgw_appl_srv_runtime~get_entityset` of my data provider class `io_tech_request_context->get_deltatoken( )` returned an empty deltatoken (`rv_deltatoken` was initial).


First I thought I had made a mistake and it's not allowed to escape URL query parameter names, but when I started researching I came across a useful answer on a GitHub issue which pointed me to IETF's RFC (Request for Comments) 3986, named "Uniform Resource Identifier (URI): Generic Syntax". Section 2.2 of this RFC points out:

2.2. Reserved Characters

 

...

 

  If data for a URI component would conflict with a reserved character's purpose as a delimiter, then the conflicting data must be percent-encoded before the URI is formed.

 

...

 

reserved = gen-delims / sub-delims

 

gen-delims = ":" / "/" / "?" / "#" / "[" / "]" / "@"

 

sub-delims = "!" / "$" / "&" / "'" / "(" / ")" / "*" / "+" / "," / ";" / "="

 

 

quotation from IETF RFC 3986

 

What does that mean?

 

As SAP Gateway officially uses/supports the OData Version 2.0 protocol, and OData Version 2.0 is based on HTTP, SAP NetWeaver Gateway also has to respect and implement the IETF RFCs governing HTTP. Instead, today it is not possible to use correctly encoded OData system query options (which are URL query parameters) such as %21deltatoken (!deltatoken). (I originally assumed %24top ($top) and %24skip ($skip) were affected as well; see Update 1 below.)

 

How to fix this NOW?

 

The best way I can think of right now to fix this (until SAP hopefully fixes it!) is to make the following enhancement at the end of the private method `INIT_REQUEST` in class `/IWFND/CL_SODATA_PROCESSOR`:

 

ENHANCEMENT 1 /Z_ENH_FIX_PARSING. "active version
  DATA: lt_uri_parameter TYPE tihttpnvp.
  FIELD-SYMBOLS: <fs_ls_uri_parameter> LIKE LINE OF lt_uri_parameter.

  CALL METHOD mo_context->get_parameter
    EXPORTING
      iv_name  = /iwfnd/if_sodata_types=>gcs_iwf_context_parameters-query_parameters
    IMPORTING
      ev_value = lt_uri_parameter.

* ENHANCEMENT for the URL query parameter unencoding bug (fixing only the deltatoken parameter for now):
* ======================================================================================================
* The problem is that all parameter names (and values) in mo_context->get_parameter(
* /iwfnd/if_sodata_types=>gcs_iwf_context_parameters-query_parameters ) are
* still encoded, and therefore parameters with correctly encoded parameter names don't
* get used by SAP Gateway. Example of SAP Gateway's URL query parameter parsing code
* from above:
*
* | READ TABLE lt_uri_query_parameters INTO ls_dollar_parameter
* |   WITH KEY name = /iwfnd/if_sodata_types=>gcs_uri_parameters-delta_token_tmp.
*
* Problem in the code above:
* --------------------------
* /iwfnd/if_sodata_types=>gcs_uri_parameters-delta_token_tmp is '!deltatoken',
* and the parameter name in lt_uri_parameter is '%21deltatoken' (still encoded).

  READ TABLE lt_uri_parameter ASSIGNING <fs_ls_uri_parameter> WITH TABLE KEY name = '%21deltatoken'.
  IF sy-subrc = 0.
    FIELD-SYMBOLS: <fs_ls_export_parameter> LIKE LINE OF et_parameter.
    APPEND INITIAL LINE TO et_parameter ASSIGNING <fs_ls_export_parameter>.
    <fs_ls_export_parameter>-name  = /iwfnd/if_sodata_types=>gcs_uri_parameters-delta_token.
    <fs_ls_export_parameter>-value = cl_http_utility=>unescape_url( escaped = <fs_ls_uri_parameter>-value ).
    SHIFT <fs_ls_export_parameter>-value RIGHT DELETING TRAILING `'`.
    SHIFT <fs_ls_export_parameter>-value LEFT DELETING LEADING ` '`.
  ENDIF.
ENDENHANCEMENT.

Now you would just need to add all the other parameter names you need parsed, or come up with a generic implementation. However, no generic approach is needed, as only `!deltatoken` is affected for now. Take a closer look at Update 1 below if you want to know why only `!deltatoken` is affected.

Looking forward to your thoughts on this in the comments. Bye!

 

 

Update 1 on 15.04.2015:

 

As Andre pointed out in his comment, not all system query options are affected. I just debugged some requests and found out that this is because not all system query options are treated equally:

 

The first place during the processing of a request by SAP NetWeaver Gateway where the URL query parameters get extracted into the context object is in method `DISPATCH` in class `/IWFND/CL_SODATA_ROOT_HANDLER`, starting at line 167:

 

*-set parameters - not exposed in read_entry method therefore tunnel via context
  lt_query_parameters = io_request->get_uri_query_parameters( ).
  io_context->set_parameter(
    EXPORTING
      iv_name  = /iwfnd/if_sodata_types=>gcs_iwf_context_parameters-query_parameters
      iv_value = lt_query_parameters ).

Now the `io_context`'s attribute `mt_parameter` looks as follows:

 

NAME                      VALUE
IWFND/QUERY_PARAMETERS    all query parameters, encoded, in a standard table of type TIHTTPNVP
<other parameters set to the context>

 

Some lines of code later, in method `DISPATCH` in class `/IWCOR/CL_DS_HDLR_ROOT`, starting at line 43:

 

lt_parameter = io_request->get_uri_query_parameters( iv_encoded = abap_false ).
lo_uri = /iwcor/cl_ds_uri_facade=>parse_uri(
           io_edm             = lo_edm
           iv_resource_path   = lv_resource_path
           it_query_parameter = lt_parameter ).

Now the `io_context`'s attribute `mt_parameter` looks as follows:

 

NAME                      VALUE
IWFND/QUERY_PARAMETERS    all query parameters, encoded, in a standard table of type TIHTTPNVP
~uri                      all query parameters, decoded, in a structure containing only one
                          component: it is named `INSTANCE` and is a `REF TO /IWCOR/CL_DS_URI`
<other parameters set to the context>

 

Processing goes on, and then in method `/IWCOR/IF_DS_PROC_ENTITY_SET~READ` in class `/IWFND/CL_SODATA_PROCESSOR`, starting at line 63, the method `INIT_REQUEST` mentioned in my original post yesterday gets called:

 

* Initialization
me->init_request(
  EXPORTING
    iv_operation         = mcs_operations-read
    iv_operation_type    = mcs_operation_types-feed
    io_entity_set        = io_entity_set
    it_key               = it_key
    it_navigation_path   = it_navigation_path
    it_expand            = it_expand
    it_select            = it_select
    io_filter            = io_filter
    io_orderby           = io_orderby
    iv_skip              = iv_skip
    iv_top               = iv_top
    iv_format            = iv_format
    iv_for_ds_operation  = iv_for
    iv_skiptoken         = iv_skiptoken
    iv_inlinecount       = iv_inlinecount
  IMPORTING
    es_request           = ls_odata_request
    et_parameter         = lt_dolar_parameter
    eo_target_entity_set = lo_target_entity_set ).

Looking up the stack trace, the variables `iv_skip`, `iv_top`, ... get populated using the corresponding attributes from the instance of `/IWCOR/CL_DS_URI` stored in the `io_context`'s `mt_parameter` table. Because the parameters handed to the constructor of `/IWCOR/CL_DS_URI_FACADE` (the class actually referenced in `mt_parameter`, a subclass of `/IWCOR/CL_DS_URI`) were correctly decoded, the constructor could detect them as system query options and store the values in its corresponding attributes.

 

Now we take a look in method `INIT_REQUEST` in class `/IWFND/CL_SODATA_PROCESSOR` to see whether there is any code accessing the still-encoded parameters stored in `mt_parameter`'s `IWFND/QUERY_PARAMETERS` (note: I added comments with the line numbers at which the code snippets can be found in the method):

 

* NOTE: line 455
* Get URI parameters
mo_context->get_parameter(
  EXPORTING
    iv_name  = /iwfnd/if_sodata_types=>gcs_iwf_context_parameters-query_parameters " Name
  IMPORTING
    ev_value = lt_uri_query_parameters ).

* Search
CLEAR ls_dollar_parameter.
READ TABLE lt_uri_query_parameters INTO ls_dollar_parameter
  WITH KEY name = /iwfnd/if_sodata_types=>gcs_uri_parameters-search.
IF sy-subrc EQ 0.
* NOTE: Setup search
ENDIF.

* NOTE: line 478
* startIndex for search
CLEAR ls_dollar_parameter.
READ TABLE lt_uri_query_parameters INTO ls_dollar_parameter
  WITH KEY name = /iwfnd/if_sodata_types=>gcs_uri_parameters-startindex.
IF sy-subrc EQ 0.
  INSERT ls_dollar_parameter INTO TABLE et_parameter.
ENDIF.

* delta token
CLEAR ls_dollar_parameter.
READ TABLE lt_uri_query_parameters INTO ls_dollar_parameter
  WITH KEY name = /iwfnd/if_sodata_types=>gcs_uri_parameters-delta_token_tmp.
IF sy-subrc EQ 0.
* NOTE: Setup delta token
ENDIF.

* NOTE: line 512
* totals
CLEAR: es_request-technical_request-totals.
READ TABLE lt_uri_query_parameters INTO ls_dollar_parameter
  WITH KEY name = /iwfnd/if_sodata_types=>gcs_uri_parameters-totals.
IF sy-subrc = 0.
* NOTE: Setup totals
ENDIF.

So the following four parameters wouldn't be recognized if their encoded representation differs from the decoded one:

 

NAME                                                         VALUE
/iwfnd/if_sodata_types=>gcs_uri_parameters-search            search
/iwfnd/if_sodata_types=>gcs_uri_parameters-startindex        startIndex
/iwfnd/if_sodata_types=>gcs_uri_parameters-totals            totals
/iwfnd/if_sodata_types=>gcs_uri_parameters-delta_token_tmp   !deltatoken

 

Now we can clearly see why only the parameter `!deltatoken` is affected. It has nothing to do with the fact that it uses a '!' instead of the '$' used by the other system query options. It's just a coincidence.

 

 

How would I fix this, working at SAP?

 

I would add `deltatoken` as a variable to `/IWCOR/CL_DS_URI`, which gets extracted from the decoded parameters during object creation. Then I would add the importing parameter `iv_deltatoken` to the method `INIT_REQUEST` in class `/IWFND/CL_SODATA_PROCESSOR` and use this variable there instead of implementing my own parsing using `READ TABLE`.
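A hedged sketch of that idea (purely illustrative, not actual SAP code):

* In /IWCOR/CL_DS_URI: a new read-only attribute, filled from the
* already decoded query parameters during construction.
DATA mv_deltatoken TYPE string READ-ONLY.

* INIT_REQUEST in /IWFND/CL_SODATA_PROCESSOR would then receive the
* decoded token as an importing parameter instead of parsing it itself:
METHODS init_request
  IMPORTING
*   ...existing parameters...
    iv_deltatoken TYPE string OPTIONAL.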

 

I would NOT change the value of `io_context`'s `mt_parameter`'s `IWFND/QUERY_PARAMETERS` from encoded parameters to decoded parameters as it can't be definitely said that this wouldn't cause side effects in other SAP coding or already existing customer coding!

 

 

Note: All line numbers were fetched on SAP_GWFND 740 SPS 09

Implementing an Odata service for Elementary search help



SAP has supported search helps as a data source since SAP NetWeaver Gateway 2.0 SPS 08. This blog serves as a step-by-step guide to implementing an OData service for an elementary search help using the Service Builder.

 

Before we begin with the creation steps, below are some of the limitations of this approach, as noted on the help.sap.com portal page for Gateway development.

  • Only elementary search helps are supported. Collective search helps are not supported.
  • Search helps that call UI in their exit function are not supported, as this cannot be validated by the Service Builder.
  • At runtime, a query operation that is based on a search help data source can retrieve a maximum of 9,999 entries.

 

For this blog I am going to use a simple custom search help with two input and output parameters. Upon execution, the hit list displays the equipment type code and description. The search help interface is shown in the figure below.

Shlp_interface.png

 

  1. Create a new project and select Data Model > Import > Search Help to bring up the entity creation wizard. In step 1 of the wizard, choose 'Search help' from the Data Source Attribute > Type dropdown and enter or select the search help for which you need to create the OData service. Click Next when done.

 

Step1_1.png

Step1_2.png

 

2. In step 2 of the wizard, select the structures from the search help result table that you want to use for modeling your entity. On the next screen, set the properties you want as keys and click Finish. As a rule of thumb, you should select all properties that have the 'IMP' flag set in the search help definition as the keys of your entity type definition (only properties marked as 'Key' can be used to import values from the consumer client).

Step2_1.png

Step2_2.png


     Check your service for consistency.

Step2_3.png

3. Create an entity set for the entity type created in the previous step.

Create_entity_set_1.png

4. Map the data source (search help) to the Read and Query operations of the entity set created in the previous step. All entity type properties marked as Key in step 2 have to be mapped (in addition to the proposed mapping) as import parameters when mapping the data source for the Read operation.

          * Only the Query and Read operations can be mapped for entity sets derived from the search help data source.

Step4_1.png

 

Step4_2a.png

Step4_3a.png

5.     Generate the runtime artifacts by clicking on the 'Generate Runtime Objects' button and check the project consistency by clicking the check button.

Step5_a.png


6. Register your service on all systems which have the GW_CORE component installed. Remember that the external service name will default to the technical service name.

Step6.png

7. Using the NetWeaver Gateway client (on your Gateway hub system or locally, depending on the architecture) you can test the service to ensure it works as expected.

 

successful gateway query.png

8. If you leave development at this stage, you will notice that filtering (using $filter) will not work. I think this is a bug; if someone notices differently, please leave a comment and I will remove the portion from here on.

To enable filtering of the entity set data, the 'EntitySetName_GET_ENTITYSET' method of the data provider extension class (ZCL_ZCA_TEST_EQUIP_TYP_DPC_EXT->EQUIPMENTTYPECOL_GET_ENTITYSET for this example) is redefined, and the following code needs to be added.

To redefine the method copy code from the data provider class and modify it with the following code snippets.

 

In the data declaration section :

Filter_enh1.png

 

 

****** Added to handle missing code for filter functionality *********

  DATA: ls_filter_select_options LIKE LINE OF lt_filter_select_options.

  DATA: ls_select_option_values  TYPE /iwbep/s_cod_select_option.

 

 

Just before the call for search help values is made using the method me->/iwbep/if_sb_gendpc_shlp_data~get_search_help_values( ), add the following:

Filter_enh2.png

 

 

**** Add missing code to allow for filtering of records **********
  IF NOT lt_filter_select_options[] IS INITIAL.
    LOOP AT lt_filter_select_options INTO ls_filter_select_options.
      ls_selopt-shlpname  = 'ZPMH_TAR_EQART'.
      ls_selopt-shlpfield = ls_filter_select_options-property.
      LOOP AT ls_filter_select_options-select_options INTO ls_select_option_values.
        ls_selopt-sign   = ls_select_option_values-sign.
        ls_selopt-option = ls_select_option_values-option.
        ls_selopt-low    = ls_select_option_values-low.
        ls_selopt-high   = ls_select_option_values-high.
        APPEND ls_selopt TO lt_selopt.
      ENDLOOP.
      CLEAR ls_selopt.
    ENDLOOP.
  ENDIF.

 

 

Now if you try filtering the OData result set you will get only the relevant response, as shown in the figure below.

filter_enh_result.png
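For reference, a filter request against such a service could look like the following URI (the service, entity set and property names here are illustrative, not taken from the screenshots above):

/sap/opu/odata/sap/ZCA_TEST_EQUIP_TYP_SRV/EquipmentTypeCol?$filter=EquipmentType eq '1000'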


Hybrid OData Implementation Example


Hello Fellow SCNers,

 

I am writing this blog to demonstrate an example of how we can use Hybrid OData in our backend developments for a POST (create) scenario.

This demonstration uses NetWeaver Gateway Foundation 7.4 SP 0008.

 

 

What is Hybrid OData?

Generally, when we need to create/post multiple data records in a single backend table (for example, multiple Sales Orders or Purchase Orders), we use a 'Batch' and encapsulate multiple POST request payloads in it. Even with a moderate number of records, say around 1,000, the data traffic is large enough that the UI user can sense the latency in processing. Imagine the case of mass creation of materials, users, etc. in the system, where the number of records can easily be at least ten times larger.

 

For such cases we can also explore the Hybrid OData development approach. It means sending the huge data set (maybe an entire file's contents with thousands of records) to a single string property in an OData entity. This is done by stringifying the data from the file into JSON format before sending. Then a single POST on this entity can create all the records.

 

Example

 

Scenario: Here I have a flat file with user details which the end user can upload from the user interface; at the backend, the users (records from that file) should be created. They should be visible in transaction SU01. Also, for every user we need to assign/revoke authorizations for multiple applications.

 

OData Service -

 

Create a service (here ZCDP_ASSESSMENT_MANAGE_USERS).

SEGW - OData Service.png

 

Create an entity (here UserUpload).

 

SEGW - Entity Details.png

 

The entity has a single property of data type 'String'. This will hold the stringified JSON sent from the user interface. This is shown below:

SEGW - Entity Property Details.png

The corresponding data dictionary structure (here ZCDP_ASSMNT_S_USER_DET) is:

Entity Structure.png

 

Now one needs to read this stringified JSON data and convert its contents to an internal table so that further processing can be undertaken.

This can be done by using transformations (http://help.sap.com/abapdocu_70/en/abapcall_transformation.htm).

 

Code as below -

 

In the Data Provider Extension Class, code the Create method of the entity as:

DPC_EXT code.png
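Since the code is shown only as a screenshot, here is a minimal sketch of the idea, assuming the entity data is typed with the DDIC structure ZCDP_ASSMNT_S_USER_DET shown above and that its string field is called USERS_DETAILS (the field name is an assumption):

* Hedged sketch - read the entity sent by the client; the single string
* property carries the stringified JSON (field name assumed).
DATA ls_user_upload TYPE zcdp_assmnt_s_user_det.
io_data_provider->read_entry_data( IMPORTING es_data = ls_user_upload ).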

 

 

The data should look like below in debug mode -

 

Debug Data.png

 

[I will give payload details below]

 

Up to this step we have read the data sent from the UI. Now, as mentioned, we need to convert it to an internal table so that normal ABAP processing can be done.

The type ZCDP_ASSMNT_S_USER_LIST will be shown below.

Convert Stringified JSON to Internal Table.png
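The parsing step in the screenshot can be sketched as follows, assuming ZCDP_ASSMNT_S_USER_LIST has a table component USER_DET matching the payload shown at the end of this blog (a sketch of the technique, not the blog's exact code):

* Hedged sketch - parse the stringified JSON into a typed structure.
* CALL TRANSFORMATION detects JSON automatically when the source
* string starts with '{'; the JSON member names must match the
* ABAP component names in upper case.
DATA ls_user_list TYPE zcdp_assmnt_s_user_list.
CALL TRANSFORMATION id
  SOURCE XML ls_user_upload-users_details
  RESULT user_det = ls_user_list-user_det.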

 

Now we will see the parsed data in debug mode. Also, I have used a nested scenario, that is, every user record has multiple application records whose access can be granted to or withdrawn from the user.

 

Table Contents.png

Nested Internal Table.png

 

and nested table contents for a user

 

Nested Internal Table contents.png

 

The actual table structure used above is ZCDP_ASSMNT_S_USER_LIST, shown below:

 

Actual File structure 1.png

and nested table structure is

 

Actual File structure 2.png

 

After this, all the contents are in an internal table and normal ABAP processing can be done.

 

 

In case one needs to re-convert the internal table to stringified JSON and pass it back to the UI, use the transformation code below:

 

Declare the object of XML Writer class

Data Declartion Rev Trans 1.png

Instantiate the object

Instantiate  Rev Trans 2.png

Do the reverse transformation and parse the internal table into XString.

Then convert the XString to a string and pass back to UI.

Code Rev Trans 3.png
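As the code is again shown only as screenshots, a minimal sketch of the reverse direction, under the same naming assumptions as above, could be:

* Hedged sketch - serialize the internal table back to a JSON string.
DATA(lo_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
CALL TRANSFORMATION id
  SOURCE user_det = ls_user_list-user_det
  RESULT XML lo_writer.
* The writer returns an xstring; convert it to a string for the UI.
DATA(lv_json) = cl_abap_codepage=>convert_from( lo_writer->get_output( ) ).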

*--------------------------------------------------------------------- That's It Folks -----------------------------------------------------------------*

The Payload used for testing is

 

{

"UsersDetails": "{\"USER_DET\":[{\"BNAME\": \"EMP9000\",\"MANAGER\": \"ABCD111\",\"KOSTL\": \"1100110011\",\"T_LEVEL\": \"EMP\",\"FIRSTNAME\": \"Employee\", \"LASTNAME\": \"9000\",\"FULLNAME\": \"Employee9000\",\"SMTP_ADDR\": \"EMP9000@DUMMY.COM\",\"LOCATION\": \"GURGAON\",\"DISABLED\": \"X\",\"PROCESSING_STATUS\": \"\",\"ERROR_TEXT\": \"\",\"APPLICATIONS\": [{\"BNAME\": \"EMP9000\",\"APP_ID\": \"MANAGE_CATEGORIES\",\"APP_MODE_INDICATOR\": \"X\",\"PROCESSING_STATUS\": \"\",\"ERROR_TEXT\": \"\"},{\"BNAME\": \"EMP9000\",\"APP_ID\": \"ASSIGN_EVALUATORS\",\"APP_MODE_INDICATOR\": \"X\",\"PROCESSING_STATUS\": \"E\",\"ERROR_TEXT\": \"Error ASSIGN_EVALUATORS\"}]},{\"BNAME\": \"EMP9901\",\"MANAGER\": \"ABCD111\",\"KOSTL\": \"1100110011\",\"T_LEVEL\": \"EXE\",\"FIRSTNAME\": \"Employee\", \"LASTNAME\": \"9901\",\"FULLNAME\": \"Employee9901\",\"SMTP_ADDR\": \"EMP9901@DUMMY.COM\",\"LOCATION\": \"GURGAON\",\"DISABLED\": \"X\",\"PROCESSING_STATUS\": \"\",\"ERROR_TEXT\": \"\",\"APPLICATIONS\": [{\"BNAME\": \"EMP9901\",\"APP_ID\": \"MANAGE_CATEGORIES\",\"APP_MODE_INDICATOR\": \"X\",\"PROCESSING_STATUS\": \"\",\"ERROR_TEXT\": \"\"},{\"BNAME\": \"EMP9901\",\"APP_ID\": \"ASSESSMENT_RESULTS\",\"APP_MODE_INDICATOR\": \"X\",\"PROCESSING_STATUS\": \"E\",\"ERROR_TEXT\": \"Error in user 2 ASSIGN_EVALUATORS\"}]}]}"

}

 

* Point to note - The double quote is a special character in JSON and needs to be escaped. This is done with the backslash character (\).

 

Hope this infuses some new development ideas. Looking forward to your valuable feedback.

Developing high-quality OData services (1) Hints on common problems


I would like to share some of my experiences in developing OData services using SAP Gateway in this article, and I would be glad if a discussion evolved out of it. Let's start with an overview of common problems and some hints on how to avoid them. I see those hints as an addition to the Best Practices published by SAP.

 

Problem 1: Requirements and domain knowledge – just start coding!?

 

That problem is not specific to OData service development; it is an issue for almost every software developer. In contrast to a traditional ABAP developer, a backend service developer may have to work separately from the front-end developers building the client that the service interacts with. That may lead to unnecessary errors and iterations to adapt your code to requirements revealed to you step by step and too late.

 

Hints:

  • Start working without precise requirements if necessary, but make it clear to everyone from the beginning that the risk of errors and the development effort rise without a precise specification.
  • If you are not an expert in the respective domain and SAP module yet, ask for someone who can answer questions on domain terms, transaction codes and tables.
  • Precise requirements are no contradiction to an agile working mode where more and more user stories are added over time, as long as expectations for each user story are clear and precise.

 

Problem 2: Service Modelling – a picture is worth a thousand words

 

The Gateway Service Builder is an effective tool for modelling OData services with SAP Gateway – but only if you already have a picture of the entity types and their relationships in mind. If not, you will likely get lost as soon as you have to define associations, referential constraints and navigation properties, and even more when you start implementing navigation in service operations.

 

1_1_SEGW.png

Gateway Service Builder (Transaction Code SEGW)

 

The SAP-internal data model can be very complex, comprising dozens of business object types with hundreds of attributes and allowing for special cases which may not be needed in your application. If you do not take the chance to simplify the data model exposed by your service, then your service will be harder to understand and to work with for client developers. Moreover, it will show suboptimal performance, as more data will have to be transmitted.

 

Hints:

  • Visualize the service model. Start with pen and paper. If it gets too complex or a "neat" documentation is needed, use a tool such as the OData Modeler.
  • Expose as few properties as possible for each entity set; get rid of everything which is not required for sure, as it will be rather easy to add it later.
  • Before exposing the SAP-internal data model "as is", try to find out whether several business objects in SAP can safely be combined into one for your application.

2_OData_Modeler.png

SAP OData Modeler for Eclipse


Problem 3: Implementation – to RFC or not to RFC

 

There is some serious criticism against it in SCN, and it is still so true: one of the biggest misunderstandings in Gateway service development is: "There are standard BAPIs for the operations, and you just have to wrap them". Such statements are mostly made by consultants who never looked at the details, or by developers who never worked on complex services yet.

 

Typical reasons which prevent simple wrapping of existing RFC modules are, from my experience:

  • Missing fields
  • Missing filter / select options
  • Unspecific error messages in case of business errors – e.g. "Message E … can not be processed in plugin mode HTTP"
  • Unexpected errors during processing by the Gateway framework, e.g. due to empty date fields or unexpected Commit Work.

 

An unnecessarily long runtime is another drawback of RFC module calls, as far more data is read from the DB than required, but for the first version of the service that may be acceptable. And there are also specific weaknesses of the RFC/BOR generator when used without custom coding, some of which are:

  • Occasional errors in data types of properties (when used for modelling entity types)
  • No chance to handle more than one line of table parameters.

 

However, the approach of using existing RFC modules and possibly the generator as a starting point also has some strengths: it gives you a very quick start with data access using water-proof business logic, some error handling, readiness for hub deployment and paging (skip/top). Apart from the implementation, I do consent to modelling entity types using the RFC/BOR generator. I believe that's fine if you plan to use RFC modules as a basis for implementing one or more service operations and if you thoroughly select and rename some properties to have consistent and clear names throughout your model.

 

3_RFC_Generator.png
Generation of a service operation by mapping RFC modules with the RFC/BOR Generator


Hints:

  • Never believe that everything will be quick and easy when wrapping existing RFC modules, but expect some manual effort to make them work as required.
  • If a remote-enabled function module for a service operation is available (standard or customer), then a flexible option is to quick-start with the RFC/BOR generator, copy the generated code from the DPC class, re-implement the method in the DPC_EXT class, paste the code there and then adapt and amend it according to your requirements.

 

Problem 4: Testing – is green really green?


The Gateway Client is included in SAP Gateway as a service testing tool and you should use it – but it will only test HTTP response codes. Particularly with reading operations, it may easily happen that you return a response with missing attribute values due to some mistake in your service implementation. As long as there is no runtime exception the HTTP status code will be 200, and the Gateway Client test will be green.

 

4_Gateway_Client.png

Gateway Client /IWFND/GW_CLIENT - will test HTTP response codes, but not body content


Hints:

  • Create a test case in the Gateway Client for every relevant operation in your service in the development system and run all of them before transporting changes to test system. Do the same in the test system before transporting to the production system.
  • Use the export / import functionality for test cases (in menu Gateway Client) to transfer them from one system to another. In the test system, select only the read operation test cases and export them for import in the production system.
  • Make sure that someone checks response content after every relevant change to your code – either yourself, your colleagues, your customer or an automated test tool.

 

Problem 5: Performance – oh my…

 

Let's face the truth: performance is often a weakness of SAP systems – and there are some good reasons for it, such as extensive checks of user authorization and input data consistency. However, as a service developer you have to do your best to keep response times at an acceptable magnitude. Especially the Query operation of a service (Get entity set) can cause problems. If the implementation is bad in terms of performance, a few hundred or thousand records in the result set may already lead to response times of several minutes.

 

Hints:

  • If performance is not satisfactory, first check your service model for any properties which are not used by the clients and delete them after you copied it to a backup project.
  • Review and refactor the implementation of the operations with long response times, most likely you will find room for improvement. If that´s not sufficient, runtime analysis with transaction SAT may help (that´s a story for a forthcoming article).
  • Instead of having clients ask for updated data every 5 minutes: do opt for a Push Service (only if the clients have fixed IP addresses / host names or if you have SAP Mobile Platform).
  • Ask client developers to request JSON format instead of XML whenever possible. As a rule of thumb you can expect shorter response times for all requests returning about 100 entities or more.
  • Check SAP Note 1801618 for recommended values of system profile parameters. That note is intended for Gateway hub systems. In case of embedded deployment, you can still use it to find out whether some of the parameter values in your system are far too low for optimal performance.

You will find more hints in Gateway Performance Best Practices - How to consume Gateway from performance point of view by David Freidlin.


Problem 6: Cry for re-use

 

A number of tasks in service implementation will occur over and over again, e.g. reading filter criteria, getting object key values, writing log entries, throwing business exceptions with messages, implementing paging with $skip and $top. Coding that stuff again and again would be a waste of time and mental capacity. Copy & paste from existing projects is a possible solution, but it's not particularly convenient and it's one of the top root causes for errors in code.

 

Hints:

  • For tasks which cannot efficiently be extracted to separate methods, extract code templates including the required DATA declarations (see the sketch after the figure below). If you have no idea yet, you may take code from the RFC/BOR generator as a starting point, e.g. for:
    • Reading filter values
    • Extracting key values
    • Determining lv_skip and lv_top values for paging
    • Throwing business exceptions and collecting messages
    • Creating timestamps from separate date / time fields and vice versa.
  • However, do not use templates to duplicate sections of identical code, as it will be laborious and error-prone to make changes later on! Rather extract it to separate classes, e.g. for:
    • Reading and writing longtexts (DB access and conversion to/from table are always the same, key composition is object-type specific and can be done in sub-classes)
    • If you have to read from several related DB tables repeatedly, create a table View in SE11 (gives you an inner join, not applicable for left outer join).

6_Create_Template.png

Editing a simple code template for timestamp creation
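As an illustration of such a template, reading the paging and filter values in a GET_ENTITYSET implementation could look like this (a sketch based on the technical request context; the variable names are chosen freely):

* Illustrative template - read $top, $skip and $filter from the request.
DATA(lv_top)  = io_tech_request_context->get_top( ).
DATA(lv_skip) = io_tech_request_context->get_skip( ).
DATA(lt_filter_select_options) =
  io_tech_request_context->get_filter( )->get_filter_select_options( ).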


Problem 7: Documentation

 

Service documentation should at least comprise information on the contained entities and relationships, on the meaning of entity properties and on the relation of entity properties to fields visible to SAP dialog users. If you do not provide that information, either you or somebody else will likely spend more time later on understanding the service before being able to work with it than you would have spent documenting it. The OData standard provides some degree of service self-description; however, compiling a complete documentation still requires quite some manual effort, which you should take into account when planning service development projects.

 

Hints:

  • Get the OData Modeler for Eclipse and import the model from the Service Catalog of your SAP Gateway system to produce a nice diagram showing entity types and their relations, as shown above for problem 2.
  • In addition to the property names of entity types, do enter a short but easily comprehensible description in the "label" column for each entity and for each property in the Gateway Service Builder (cf. Figure 1). This will be included in the response when clients request the metadata document of your service.
  • For documenting the relation of entity properties to fields visible to SAP dialog users I personally work with screenshots from the Gateway Service Builder and SAP dynpros – see screenshot below for an example.

MaintenanceOrder_Properties.png

Portion of a screenshot from Gateway Service Builder as part of documentation of entity type properties

 

IW33 Header Data.png

Screenshot from SAP transaction with references to entity type properties as part of the service documentation

 

What's your experience?

 

Which problems and solutions did you encounter in modelling, implementation, testing and documentation of OData services with SAP Gateway? Let's discuss them.

Step by Step Gateway Odata Service Creation Using SEGW - method 1. importing DDIC structure


Here I am exposing an SAP table as OData using the Gateway Service Builder (transaction SEGW).

 

Steps

1. Go to transaction SEGW, then click on the Create icon.
Fill in the project name and description, choose a package and click the tick button. A new project will be created.

 

2. Expand the tree, select the Data Model node, then right-click and select Import --> DDIC Structure.

 

3. Enter the name of the entity (give any reasonable name) and select the ABAP structure. Here I have selected a custom table created by me.

 

4. Select which fields are to be exposed. Tick the needed fields and click the Next button.

 

5. Select the key fields in the table and click the Finish button.

 

6. Now click on the Generate button to generate the model provider classes and data provider classes. These will be generated automatically by the system.

 

7. Click the tick button.

 

8. Expand the Service Maintenance node and click the Register button.

 

9. Select the system alias. Here I am selecting LOCAL. Click the tick button.

 

10. If you want to change the service name you can do it here; I am not going to change it. Select Local Object and press the OK button. Now press the Maintain button.

 

11. Now you can view your service in the browser or the Gateway client.

 

12. See the result in the Gateway client. Note that the HTTP status is 200 OK. If you get anything other than 200, something went wrong.
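For example, a request like the following should return HTTP 200 (the service and entity set names below are placeholders; yours depend on your project):

/sap/opu/odata/sap/ZMY_PROJECT_SRV/MyEntitySet?$format=xml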


If you need screenshots and more details, please visit my blog:

Step by Step Gateway Odata Service Creation Using SEGW - method 1. importing DDIC structure | sap abap | …                

SAP NetWeaver Gateway ODATA and RFC Performance Testing (Mobility) – Simplified


SAP-NetWeaver-Gateway-ODATA-RFC-Performance -Testing.jpg
This blog will show you how to test OData services and RFCs for ABAP programs easily and effectively. I have created a simple step-by-step guide on SAP backend testing that is useful for SAP testers.

 

Remote Function Call (RFC) Testing:

In the steps below I will discuss how to perform an RFC performance test as simply as possible.

 

 

Step 1:

    Open the SAP GUI launchpad and log in to access the QA (Quality Assurance) server.
SAP-GUI- Launchpad-and -login-to-access -the-Q-&-A.jpg

Step 2:

    Enter transaction code SE37 or SE80 (ABAP Development Workbench) in the command field to access packages and function modules.
ABAP-Development-Workbench.jpg

Step 3:


Open your function group, select an RFC and execute it. An RFC detail page will be displayed.
function-Group- for-to-select-a-RFC.jpg

Function-Module.jpg

Step 4:

Enter the required inputs in the Import Parameters section.

 

 

Step 5:

After entering the inputs and executing the RFC, an output with the respective return message will be displayed.

 

 

The below sections will provide step-by-step testing methods to understand in-depth details on run-time analysis and performance.

 

 

Transaction Code Run-time Analysis Testing:
SE30 – Transaction code for “Runtime Analysis”.

 

 

Entering this TCODE in the command field will display the hit list, hierarchy, tables accessed during runtime, execution time of all statements, FM calls and methods.
Run-time-Analysis-Testing.jpg

Run-time-Analysis-Display-Measurement.jpg

Hit List

  • Displays the execution time of each statement in the program

 

Hit-List.jpg

Tables

  • Displays the tables accessed during run time

Run-time-Analysis-Tables.jpg

Hierarchy

  • Displays the execution time of each statement in the actual order in which they were executed. Uses indentation to indicate the level of nesting of statements within subroutines

Run-time-Analysis-Display-Measurement-Hierarchy.jpg


Performance Analysis Testing:

ST05

SQL Trace is a performance analysis tool for tracking and analyzing an SAP ABAP program, collecting performance statistics related to SQL statement execution. Analyze the trace to validate performance.

 

 

Select Trace Type:

  • Select the respective trace mode
  • Select either SQL Trace, RFC Trace, or both

 

Now, click on Activate Trace on the application toolbar to activate the selected trace type, or click Deactivate Trace to deactivate it. Start or stop trace recording when required.

 

Trace files are managed by the system and can be saved, like any other object. Display or delete saved files when needed.

 

The below steps will explain how to start, stop and display trace selected.

 

Starting the Trace

Prerequisites

You can start the Performance Trace on only a single instance. Decide on the scope and targets of your performance analysis before starting the trace.

 

 

Procedure

Activate a trace from the application toolbar. A status message "user name and selected trace type activated" will be displayed in the status box.

 

Performance -Trace.jpg

Stopping the Trace

Prerequisites

For performance reasons, stop the traces as soon as you have finished recording.

Procedure

Deactivate the trace. A status message "All Traces are Switched Off" will be displayed.
Deactivate - Trace.jpg

Display Filter

After deactivating the performance trace, open the display filter by selecting "Display Trace" in the application toolbar.

 

Prerequisites

Use the display filter (Set Restrictions for Displaying Trace) to restrict the number of logged trace records displayed on the Trace List, detailed list, or time-sorted list.

If you do not enter any selections, all of the trace records are selected.

 

Specifying the Trace Type to Display

Default trace type is SQL trace. If you start the display filter directly after the recording, the trace type appears as it was last configured.

SQL-Trace- Records.jpg

Display Trace List

Execute the trace type to display the trace records.

Prerequisites

Select the Display Trace function with specified value range in the Display Filter.

 

Lists

Complete trace lists will be displayed.

 

The first line of the list contains a sub-header with the name of the transaction, process identification number, process type, client, and user name.

The next line contains the Duration and Execution Time headers for the statements. The time is displayed in milliseconds and microseconds.

 

RFC-Performance-Testing.jpg

Practice the above simple steps to do effective RFC Performance Testing.

ODATA Testing:

OData is a standardized protocol built on the existing HTTP and REST protocols, supporting CRUD (Create, Read, Update and Delete) operations for creating and consuming data APIs. Here I will discuss how to test an OData service simply.

 

 

Step 1:

Install the Advanced REST Client Chrome extension in the browser and enter the basic login credentials (SAP login info).
SAP-login-info.png


Step 2:
 

Paste the OData service URL in the respective area and select method "GET" or "POST". After selection, click the Send button to display the response of the URL.
ODATA-service-URL.jpg

Step 3:

The output contains the data fetching time, the status of the result and the OData content from the response body.

Performance analysis of OData:
Use the TCODE /n/iwfnd/traces to access performance details like average times (in milliseconds), the data fetching method, the requested response and the OData payload.

 

SAP-NetWeaver-Gateway-Tracing-Tools.png

Prerequisites

The REST client login credentials and the NetWeaver Workbench credentials should be the same.

Procedure

Activate Performance Trace and Payload Trace in the Configuration tab and click Save Configuration. Run the OData request in the REST client. After completing execution, click the Performance Trace tab in the NetWeaver Workbench.

 

SAP-NetWeaver-Gateway-Tracing-Tool.png

To view complete details on a specific OData call, click on the respective OData URL.
SAP-NetWeaver-Gateway-Performance-Trace.png

Click on the Payload Trace Tab to track the flow of data sent based on requests and responses.

SAP-NetWeaver-Gateway-Tracing-Tool-Payload-Trace.png

After configuring and activating Payload Trace, a table will be displayed showing the service call info, the method used, the transaction ID, the date and time of service execution with the expiry date of the trace, and the status of the service call.
SAP-NetWeaver-Gateway-Payload-Trace.png

The steps provided above should make SAP RFC and OData performance testing for ABAP programs convenient.

Simple Steps to Develop ODATA Using Global Classes in CREATE_DEEP_ENTITY Method Operation with JSON Format


Simple-Steps-to-Develop-ODATA-Using-Global-Classes-in-CREATE_DEEP_ENTITY-Method-Operation-with-JSON-Format.jpg

This blog will show you, in simple steps, how to develop OData using global classes in the CREATE_DEEP_ENTITY method operation with JSON format.

 

Step 1:

Create Custom Global Classes with below Structures.

 

Go to transaction SE11 and create structures as displayed in the screenshots below.

 

Create Table Type ZPHYSINV_ITEMS_TAB.

Include the line type Structure.

 

ODATA-display-Table-Type.png


Structure 1:

BAPI_PHYSINV_CREATE_ITEMS

 

ODATA-dictionary-display-Structure.png

Structure 2:

BAPI_PHYSINV_CREATE_HEAD
ODATA-dictionary-display-Structure-2.png

Structure 3:

Create Table Type ZRETURN_TAB.

Include the line type Structure ZBAPIRET.

ODATA-dictionary-display-Table.png

ODATA dictionary display Structure.png

Go to transaction SE24 and create a global class with the following parameters.

Global Class: ZCL_MOBIM_ICC

ODATA-Class-Builder.png

Parameters for method CREATE_INVENTORY_DOCUMENT:

ODATA Class Builder-Display.png

After creating the structures and the class in SE24 with the parameters given above, paste the code below into the method CREATE_INVENTORY_DOCUMENT.

 

Code Snippet:

 

method create_inventory_document.

*--Data declaration
  data: ls_head     type bapi_physinv_create_head,
        ls_maxitems type am07m-maxpo value 300,
        lt_item     type standard table of bapi_physinv_create_items,
        ls_item     type bapi_physinv_create_items,
        lt_return   type standard table of bapiret2,
        ls_return   type bapiret2,
        wa_return   type zbapiret.

  data: lv_error_detected(1) type c,
        lv_text type bapi_msg,
        lv_flag.

*--Pass the header details
  ls_head-plant     = im_header-plant.
  ls_head-stge_loc  = im_header-stge_loc.
  ls_head-doc_date  = im_header-doc_date.
  ls_head-plan_date = im_header-plan_date.

*--Pass the multiple items
  loop at im_item into ls_item.
    append ls_item to lt_item.
  endloop.

*--Create inventory documents
  call function 'BAPI_MATPHYSINV_CREATE_MULT'
    exporting
      head     = ls_head
      maxitems = ls_maxitems
    tables
      items    = lt_item
      return   = lt_return.

*--Read success messages
  read table lt_return into ls_return with key type = 'E'.
  if sy-subrc ne 0.
    call function 'BAPI_TRANSACTION_COMMIT'
      exporting
        wait = '1'.
    loop at lt_return into ls_return where type eq 'S'.
      call function 'FORMAT_MESSAGE'
        exporting
          id        = ls_return-id
          lang      = sy-langu
          no        = ls_return-number
          v1        = ls_return-message_v1
          v2        = ls_return-message_v2
          v3        = ls_return-message_v3
          v4        = ls_return-message_v4
        importing
          msg       = lv_text
        exceptions
          not_found = 1
          others    = 2.
      clear wa_return.
      wa_return-type    = 'S'.
      wa_return-message = lv_text.
      append wa_return to ex_return.
      clear: ls_return, wa_return, lv_text.
    endloop.
  else.
    lv_flag = 'X'.
  endif.

*--Read error messages
  if lv_flag eq 'X'.
    call function 'BAPI_TRANSACTION_ROLLBACK'.
    loop at lt_return into ls_return where type eq 'E'.
      call function 'FORMAT_MESSAGE'
        exporting
          id        = ls_return-id
          lang      = sy-langu
          no        = ls_return-number
          v1        = ls_return-message_v1
          v2        = ls_return-message_v2
          v3        = ls_return-message_v3
          v4        = ls_return-message_v4
        importing
          msg       = lv_text
        exceptions
          not_found = 1
          others    = 2.
      clear wa_return.
      wa_return-type    = 'E'.
      wa_return-message = lv_text.
      append wa_return to ex_return.
      clear: ls_return, wa_return, lv_text.
    endloop.
  endif.

endmethod.

 

Now you have created the global class. Let's start developing the OData service in the Gateway Service Builder (SEGW) for the above class.

 

 

Step 2: Create Project in SEGW

 

As I mentioned earlier, we will implement CREATE_DEEP_ENTITY for multiple postings with a single request, i.e. one header with multiple line items.

 

In the example below, I will show a posting with one header and multiple line items.

Create the entity types, entity sets, association and navigation as follows.

 

Create two Entity Types and Entity Sets.

 

Entity Type-1 - ICC_Header

Entity Type-2 - ICC_Items

 

Entity Set-1- ICC_HeaderSet

Entity Set-2- ICC_ItemsSet

ZMOBOIM-ICC.png

Properties of ICC_Header

ODATA-ICC-Header.png

Properties of ICC_Items

ODATA-ICC-Items.png

Step 3:

Create Association and Navigation.

Create Associations as shown below.

 

Association - Assoc_ICCHeader_ICCItems

ODATA-Association-ICC-Items.png

Create the navigation as shown below.

 

Navigation – ICC_Items

SAP-NetWeaver-Gateway-Service-Builder.png

Now generate the runtime artifacts. Once generation is successful, you will get 4 classes: two data provider classes and two model provider classes. Implement the code in the data provider class as shown below.

 

Double click on the Class ZCL_MOBOIM_ICC_DPC_EXT. Go to ABAP Workbench.

ODATA MOBOIM ICC DPC EXT.png

Start the Gateway client by calling transaction /IWFND/GW_CLIENT.

Enter the following URI to test your implementation.

Append $metadata to the base service URI.

 

/sap/opu/odata/sap/ZMOBOIM_ICC_SRV/?$metadata

 

If everything works perfectly then HTTP response will be displayed as shown below with value 200.

SAP NetWeaver Gateway Client.png

Check the service document by appending $format=json.

ODATA SAP NetWeaver Gateway Client.png

Step 4:

Implementing the CREATE_DEEP_ENTITY Method.

 

Right-click and redefine the CREATE_DEEP_ENTITY method as shown in the screenshot below.

Create Deep Entity Method.png

Now double-click on the CREATE_DEEP_ENTITY method and paste in the code below.

ODATA Repository Browser.png

Code Snippet:

 

method /iwbep/if_mgw_appl_srv_runtime~create_deep_entity.

*--Types declaration for the inventory cycle document
  types: ty_t_iccitem type standard table of zcl_zmoboim_icc_mpc=>ts_icc_items with default key.

*--Represents the inventory cycle document structure - header with one or more items
  types: begin of ty_s_icc.  "Deep structure for the inventory cycle document
          include type zcl_zmoboim_icc_mpc=>ts_icc_header.
  types: icc_items type ty_t_iccitem,    "Name should be the same as the entity type name
         end of ty_s_icc.

*--Data declaration for the inventory cycle document
  data: ls_icc       type ty_s_icc,
        ls_iccitem   type zcl_zmoboim_icc_mpc=>ts_icc_items,
        ls_header    type bapi_physinv_create_head,
        it_item      type standard table of bapi_physinv_create_items,
        wa_item      type bapi_physinv_create_items,
        wa_icc_items like line of ls_icc-icc_items.

  constants: lc_iccitem   type string value 'ICC_Items'.
  constants: lc_countitem type string value 'Count_Items'.
  constants: lc_diffitem  type string value 'PostDiff_Items'.

*--Data declaration for the return messages
  data: lv_compare_result type /iwbep/if_mgw_odata_expand=>ty_e_compare_result,
        lt_return type zreturn_tab,
        ls_return type zbapiret.

  case iv_entity_name.
    when 'ICC_Header'.

*--Validate whether the current request including the inline data matches
      lv_compare_result = io_expand->compare_to( lc_iccitem ).
*--Upon match, access the data from IO_DATA_PROVIDER
      io_data_provider->read_entry_data( importing es_data = ls_icc ).

*--Pass the header data
      ls_header-plant     = ls_icc-plant.
      ls_header-stge_loc  = ls_icc-stge_loc.
      ls_header-doc_date  = ls_icc-doc_date.
      ls_header-plan_date = ls_icc-plan_date.

      clear ls_iccitem.

*--Pass the item data
      loop at ls_icc-icc_items into ls_iccitem.
        move-corresponding ls_iccitem to wa_item.
        call function 'CONVERSION_EXIT_MATN1_INPUT'
          exporting
            input  = ls_iccitem-material
          importing
            output = wa_item-material.
        append wa_item to it_item.
        clear: wa_item, ls_iccitem.
      endloop.

*--Create the inventory cycle counting document (custom global class)
      call method zcl_moboim_icc=>create_inventory_document
        exporting
          im_header = ls_header
          im_item   = it_item
        importing
          ex_return = lt_return.

      if lt_return is not initial.
        loop at lt_return into ls_return.
          move-corresponding ls_return to wa_icc_items.
          modify ls_icc-icc_items from wa_icc_items index sy-tabix transporting type message.
        endloop.
      endif.

      copy_data_to_ref(
        exporting
          is_data = ls_icc
        changing
          cr_data = er_deep_entity ).
      clear ls_icc.

    when others.
  endcase.

endmethod.

 

 

 

Step 5: 

Create a Request Payload.

 

HTTP  Request in JSON Format:

 

HTTP Method: POST
X-CSRF-Token: Fetch
Content-Type: application/json
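The request is posted to the header entity set created above, for example:

POST /sap/opu/odata/sap/ZMOBOIM_ICC_SRV/ICC_HeaderSet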

 

JSON  Format Payload:

 

{
  "d": {
    "Plant": "1100",
    "StorageLoc": "1000",
    "DocDate": "2015-07-21T00:00:00",
    "PlanDate": "2015-07-21T00:00:00",
    "ICC_Items": [
      {
        "Material": "2144",
        "Batch": " ",
        "StockType": " ",
        "AltUnit": " "
      },
      {
        "Material": "2145",
        "Batch": " ",
        "StockType": " ",
        "AltUnit": " "
      }
    ]
  }
}

 

 

 

 

Step 6: 

Post the data with JSON format.

ODATA with JSON format.png


Set an external breakpoint and check whether the values are fetched into the structure.

 

JSON format-external break point.png

JSON format-IT-ITEM.png

Come out of the debugger.

Records will be created and HTTP response 201 will be returned, as displayed below.

ODATA HTTP response.png

That’s all. You’re done with it!!!

 

I hope the above steps will be useful in developing OData with global classes in NetWeaver Gateway. Suggestions and comments on this post will be appreciated.

Cost Effective SAP NetWeaver Gateway Trial


    There are several ABAP 7.4 cloud trials that include Gateway (as Andre Fischer mentioned in his blog). Some of them are listed on the Developer & Trial edition page. Each of these cloud trials is free from a licensing perspective, but has associated cloud usage costs (differing depending on the resources used). The cheapest are the anyDB (not HANA) options, like AS ABAP incl. BW 7.4 SP8 on SAP ASE 15.7 ($0.39/h) and BW 7.4 SP2 on SAP MaxDB 7.9 ($0.51/h). What is missing in the list is the most cost-effective option: SAP NetWeaver Application Server ABAP 7.4 on SAP MaxDB [trial edition] ($0.26/h). The cloud image for this trial does not come with a fully configured Gateway, but that is easy to fix by maintaining the SAP NetWeaver Gateway settings.


NPL1.jpg

AS ABAP incl. BW 7.4 SP8 on SAP ASE


NPL2.jpg

BW 7.4 SP2 on SAP MaxDB 7.9


NPL3.jpg

NPL4.jpg

SAP NetWeaver Application Server ABAP 7.4 on SAP MaxDB

 

 

 

SAP NetWeaver Gateway Settings Maintenance

 

Trx. SPRO: SAP Customizing Implementation Guide -> Activate Business Functions -> SAP NetWeaver -> Gateway Service Enablement -> Backend OData Channel -> Connection Settings to SAP NetWeaver Gateway -> SAP NetWeaver Gateway Settings


NPL5.jpg

 

Once Gateway is configured you can start your journey of exploring SAP NetWeaver Gateway. Here are two documents I recommend following to learn how to create your own OData services:

Let’s code CRUDQ and Function Import operations in OData service!

Let's code association/navigation and data provider expand in OData service!

 

 


OData - Everything that you need to know (Part 1)


Introduction

 

This series of blogs is an attempt to provide a comprehensive guide to OData and how it is consumed in the SAP world. There is a lot of content on the Internet explaining what OData is, how services are published and how they are consumed, but it is scattered and assumes that the reader already has clarity on some very important fundamentals. However, there are so many aspects which deserve more than just a mention. In this blog series, we will try and cover the topic 360 degrees.

 

We start with the very basics of HTTP and then gradually build on to OData and finally conclude with a detailed end-to-end exercise on how to create an OData service and how to consume it.

 

 

How Internet Works?

 

It might sound like an unrelated topic when the focus is on OData, but HTTP is the common thread between the two. HTTP is the underlying protocol on which OData is based and also one of the most common languages spoken on the Internet.

 

001.jpg

 

 

 

You would typically need a web browser, the address of the document you want to read (the URL, or Uniform Resource Locator), and a web server running on the system where the document is located.

 

A web browser is software responsible for retrieving and presenting resources on the web. The web browser parses the HTML response received from the web and prepares a DOM tree. DOM stands for Document Object Model and is a language-independent convention for representing objects in HTML. The web browser acts as an interface between you and the World Wide Web.

 

You can request a document located somewhere on this planet by providing its address via the browser, which sends an HTTP request to the web server. The web server then sends back an HTTP response that is parsed by the web browser for you to view the document on your machine.

 

HTTP stands for Hyper Text Transfer Protocol and, as the name suggests, it defines the semantics of what the browser and the web server can communicate to each other. The how part of the communication, that is the byte-by-byte transfer of the data packets across the network, is not HTTP but is done via the TCP/IP protocols.

 

An important component in the story so far is also the web server. In a nutshell, a web server is a server program which sits on a physical server and waits for requests from a client. Whenever a request is received, the web server prepares a response and sends it back to the client using the HTTP protocol. It is important to note that HTTP is not the only, but by far the most commonly used protocol, and that the client here can be a browser or any other software that communicates in HTTP.

 

Let us now try and put up a formal definition of HTTP.

"HTTP is based on a client-server architecture style and uses a stateless request/response protocol to communicate between the client and the web server. The HTTP protocol helps define what requests can be sent by the client and what responses are sent by the web server. The communication of data over the network is, however, done using the TCP/IP protocol".

 

It is evident from our description so far that HTTP is based on a client-server architecture: the browser being the client which sends the HTTP request, and the web server sending the response back to the browser. But why stateless?

Every single HTTP request that is received by the web server is forgotten after a response has been sent across. Web servers do not process an HTTP request by remembering any previous request.

 

 

URI and URL are not the same thing

 

We mentioned a while back that you need to know the URL of the document you wish to view in your browser. Can I not mention a URI as an address to get the document on the web? Let us quickly understand the difference between the two.

 

URI, or Uniform Resource Identifier, can be classified as a name, a locator or both, that identifies a resource uniquely.

URL (Uniform Resource Locator) is actually a subset of URI that not only identifies a resource uniquely but also provides a means of locating the resource.

 

 

002.jpg

 

Defining the Request and Response

 

HTTP requests are sent with one of the defined Request Method that indicates the action to be taken on the resource. Following are the two most commonly used methods:

 

GET

This is used to retrieve information of the resource from the server.

 

POST

This method is used to send data to the server.

 

For example, if you enter a text string on the Google search page and press enter, it will generate an HTTP request with the GET method in the background. On the other hand, if you provide your username/password on a login page and press enter, a POST HTTP request will be sent to the server.

 

The HTTP response from the web server comes with data and a status code. The status code provides context to the response. For instance, if you do not correctly provide the resource location, the web server will send you a response which you are not expecting. Along with the data comes the status code, known universally in the HTTP world, which explains to the user what could be the reason for an unexpected response.

 

HTTP codes are 3 digit integers.

Code and Description

1xx: Informational

It means the request has been received and the process is continuing.

2xx: Success

It means the action was successfully received, understood, and accepted.

3xx: Redirection

It means further action must be taken in order to complete the request.

4xx: Client Error

It means the request contains incorrect syntax or cannot be fulfilled.

5xx: Server Error

It means the server failed to fulfill an apparently valid request.

 

For example, status code 400 means Bad Request. It means that the server did not understand the request.

 

OData allows creation and consumption of RESTful APIs. In the next blog, I will cover what RESTful APIs are, as the next stepping stone towards understanding OData.


Next Blog: OData - Everything that you need to know (Part 2)

OData - Everything that you need to know (Part 2)


In my previous blog, I discussed how the Internet works and gave an introduction to HTTP and the difference between URL and URI. In this blog we will discuss REST-based APIs.

 

Previous Blogs in the series:

OData - Everything that you need to know (Part 1)

 

What is REST?

 

REST or REpresentational State Transfer is an architectural style that uses simple and lightweight mechanism for inter-machine communication. It is an alternative to the RPC (Remote Procedure Calls) and Web Services.

 

REST is resource based, unlike RPC or SOAP which are action based. In SOAP or RPC you will have a request such as "get material data", whereas in REST, the material is identified as a resource using a URI, and HTTP verbs then determine which operation shall be performed on the resource. It is important to note here that it is possible to have multiple URIs pointing to the same resource.

 

A representation of the resource is not the resource itself, but only a representation.

The term representation describes how resources are manipulated in a REST-based architecture.

Representations depict parts of the resource state which are transferred between client and server, mostly in JSON or XML format. The client will typically have enough information to manipulate the resource on the server. For example, if a Person is modelled as a resource and there is a service to get the contact information of a person, then the representation of that Person that you will get would be the Name, Address and Phone details in JSON or XML format.

 

 

001.jpg

 

The Six Constraints

 

Following are the six constraints defined which are attributes of a RESTful API.


1. Uniform Interface

Uniform Interface is fundamental to the design of any REST based service. The uniform interface simplifies and decouples the architecture, which enables each part to evolve independently.

What that means is that it should be possible to identify the individual resource in the request, for example using a URI. Once the client has a representation of the resource, it should have enough information to update or delete the resource. And the client should not assume that any particular action is available on a resource beyond those described in the representations received from the server previously.
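As an illustration (the resource names are chosen freely), a uniform interface on a Person resource could look like:

GET    /Persons('0001')   - retrieve the representation of the person
PUT    /Persons('0001')   - update the person using a representation
DELETE /Persons('0001')   - delete the person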

 

 

2. Stateless

The server should not contain any client state. Requests can be processed only if they are self-descriptive and carry enough context to act upon. For example, if the Person resource's address needs to be updated, then the client must pass in the request the details of the particular Person resource for which it received a representation from the server in a previous request. If state has to be maintained, it should be on the client side.

 

 

3. Client-Server

We have already established this constraint: the RESTful architecture is a client-server architecture. Resource representations are transferred between client and server. We should always keep in the back of our minds that the client of a RESTful API will not have a direct connection to the resources.

 

 

4. Cacheable

Another constraint on a RESTful API is that any response coming from the server should be cacheable on the client side. Caching could be implicit, explicit or negotiable. Implicit and explicit caching are self-explanatory, whereas negotiable caching means that the server and client agree on how long a representation can be cached on the client.

 

 

5. Layered System

The constraint of a layered system is closely related to the above two constraints of client-server and cacheable. It suggests that the client of a RESTful API should not assume that there will be a direct connection between the client and the server. There could be multiple layers of software and/or hardware in between the two. The client need not know whom exactly it is talking to and whether a response is coming from the server or is accessed from a local cache. This improves scalability.

 

 

6. Code on Demand

This constraint suggests that it should be possible for a server to extend a client temporarily. It means that the server can transfer logic to the client as a representation to be executed at the client.

This is the only optional constraint.

 

 

A RESTful service needs to adhere to all of the above-mentioned constraints (except Code on Demand) to be called a RESTful API.

 

 

Having covered HTTP and REST-based services, we will cover OData in my next blog.

 

Next Blog: OData - Everything that you need to know (Part 3)

OData - Everything that you need to know (Part 3)


In my previous two blogs, I covered the basics of how the Internet works, HTTP, and what REST-based services are. Now that we have laid the foundation, we will start with understanding what the OData protocol is.

 

Previous Blogs in the series:

OData - Everything that you need to know (Part 1)

OData - Everything that you need to know (Part 2)

 

 

What is OData?

 

"Open Data Protocol (OData) is an open data access protocol from Microsoft that allows the creation and consumption of query-able and interoperable RESTfulAPIs in a simple and standard way".

 

The protocol enables clients to publish and manipulate resources identified by URIs and defined in a data model, using simple HTTP messages.

 

To put it in simple words, OData is an open protocol to exchange data over the Internet. The service hosts the data, and clients can call this service to retrieve the resources and manipulate them. Servers expose one or more endpoints, which are services that refer to the resources. Clients need to know these server-side endpoints to call the service to query or manipulate the data. The protocol is HTTP based and designed with a RESTful mindset, which means it follows the constraints needed to be called a RESTful service.

 

Since the protocol is HTTP based, any programming language with an HTTP stack can be used to consume OData services. Existing client-side libraries can be used to transform the JSON or Atom payloads from the server into objects, making programming simple. On the other hand, many libraries exist on the server side to generate the payloads in Atom or JSON from the existing data.

It is important to note that the client-side and server-side development can be in completely different programming languages as long as both are able to communicate via HTTP.

Clients consume the service to query and manipulate the data from OData services and are also called Consumers of an OData service.

Similarly, servers that expose OData services via endpoints are known as Producers of OData services.

 

001 (2).jpg

 

So, we now know that in the OData protocol, the resources are exposed in two formats: XML-based Atom and JSON.

 

A brief description of what Atom and JSON are:

Atom is a combination of two protocols, the Atom Syndication Format and the Atom Publishing Protocol. The Atom Syndication Format is an XML language used for web feeds, while the Atom Publishing Protocol (AtomPub or APP) is a simple HTTP-based protocol for creating and updating web resources.

 

JSON, which stands for JavaScript Object Notation, is a lightweight data-interchange format. JSON is self-descriptive, easy to use, and completely language-independent.
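
As a small illustration of the two formats, the sketch below (same hypothetical endpoint as above) requests the same entity set once as Atom and once as JSON; in OData V2, JSON can be requested via the Accept header or the $format system query option.

import requests

SERVICE = "https://services.example.com/odata/DemoService"  # hypothetical

# Atom (XML) representation, the default for many OData V2 services.
atom = requests.get(SERVICE + "/ProductSet",
                    headers={"Accept": "application/atom+xml"})

# JSON representation, requested via the Accept header ...
as_json = requests.get(SERVICE + "/ProductSet",
                       headers={"Accept": "application/json"})

# ... or equivalently via the $format system query option.
also_json = requests.get(SERVICE + "/ProductSet", params={"$format": "json"})

print(atom.headers.get("Content-Type"), as_json.headers.get("Content-Type"))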

 

 

Advantages with OData

 

There is a lot of data on the web today, but much of it is locked up in specific applications or formats and is difficult to access from outside. Many organizations have now started exposing data using REST-based services; however, it is difficult to write applications that work with multiple data sources, as each provider exposes its data in a slightly different way. An OData service producer can expose its service along with metadata that contains the semantics for consumption. OData exploits commonly understood formats like XML, Atom and JSON for communication. Clients can therefore understand these OData services using generic tools and can combine information from multiple data sources.

Exposing your data as OData services comes with multifold advantages. For example, as mentioned earlier, as a consumer you need not worry about the programming language used by the producer, as long as the services are exposed as OData services.
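
A short sketch of what such generic, metadata-driven consumption looks like, again against the hypothetical endpoint used above: the service document lists the entity sets, and the $metadata document describes their semantics.

import requests

SERVICE = "https://services.example.com/odata/DemoService"  # hypothetical

# The service document lists the entity sets the producer exposes ...
service_doc = requests.get(SERVICE + "/", headers={"Accept": "application/json"})
print(service_doc.text)

# ... and $metadata returns an EDMX (XML) description of the entity types,
# their properties, keys and associations, which generic tools can interpret.
metadata = requests.get(SERVICE + "/$metadata")
print(metadata.text)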

 

002.jpg.png

 

 

ODBC for the Web

 

 

ODBC (Open Database Connectivity) is a standard API for accessing database management systems, independent of the particular DBMS or operating system. ODBC manages this by adding drivers between the application layer and the DBMS that translate the queries fired by the application into instructions the DBMS can understand.

OData is similar to ODBC in the sense that it provides the middleware for producers and consumers to communicate data. There is a uniform way to consume data that is independent of the producer, much like ODBC. The fact that OData is based on HTTP and RESTful services makes it the ODBC for the Web!

 

 

In my next blog, I will talk about SAP Netweaver Gateway, SAP OData Channel and the main elements of an OData service.


Next Blog: OData - Everything that you need to know (Part 4)

OData - Everything that you need to know (Part 4)


In my previous three blogs, I covered the basic concepts of HTTP, RESTful APIs and their constraints, and finally OData services and how they are RESTful services that use simple HTTP to communicate between client and server.

 

Previous Blogs in the series:

OData - Everything that you need to know (Part 1)

OData - Everything that you need to know (Part 2)

OData - Everything that you need to know (Part 3)

 

OData provides a way to expose data in a uniform way. Although Microsoft initially came up with the OData specification, there is a lot of industry support behind OData from companies like Netflix, eBay and IBM, SAP being one of them. In this blog, we will discuss SAP Netweaver Gateway and how, along with OData, SAP can cater to multiple channels with a unified data model. In future blogs, we will see how to create an OData service in a real-world scenario and explain various aspects of OData along the way.

 

 

SAP Netweaver Gateway - An Introduction

 

For business users, ease of access has become critical with the advent of various mobile devices that support end-to-end scenarios and also provide a rich user experience. There is a need to better integrate data, reduce complexity and improve productivity. The challenge for the developer community is to let multiple platforms consume enterprise data more easily than ever before, while providing solutions that are scalable and have a low TCO (Total Cost of Ownership).

Point-to-point solutions are not the right way, as they have many drawbacks, including duplicated development effort, increased cost, a complex landscape, poor scalability and difficult maintenance.

 

SAP Netweaver Gateway is a technology that answers this. It can seamlessly connect devices, platforms and environments to SAP enterprise data using OData services. SAP Netweaver Gateway offers connectivity to SAP business data from any programming language and without the need for strong SAP development knowledge.

 

A.png

 

 

 

SAP Netweaver Gateway comes with design-time tools to facilitate modeling OData services for consumption. These tools improve the user experience and provide automatic connectivity to the SAP backend, which reduces development effort and improves productivity.

 

 

B.png

 

SAP Netweaver Gateway sits on top of the existing SAP backend infrastructure. A set of add-on components needs to be installed on the SAP backend system to enable SAP Netweaver Gateway services. It is this architecture that enables consumption of SAP data exposed by a variety of interfaces such as BAPIs and RFCs.

 

The SAP Netweaver Gateway Service Builder is an SAP GUI-based modeler for creating and publishing OData services, available via transaction SEGW.

 

C.png

 

There is also an Eclipse-based OData Modeler tool available. It provides an easier way to model services if you are not too familiar with the ABAP programming language. Models created here can be conveniently exported to the Gateway development environment.

 

D.jpg

 

 

SAP Annotations with OData

 

SAP has leveraged the extensibility feature provided by the Atom Publishing Protocol to add SAP-specific annotations. AtomPub allows you to add your own markup to the service documents. SAP adds annotations taken from the ABAP Data Dictionary, for example field labels, which can then be consumed by the front-end application without it having to worry about translations.
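
As a sketch (hypothetical service name, standard Gateway URL pattern), the SAP annotations appear as sap:* attributes in the $metadata document and can be read by any XML-capable consumer; sap:label carries the Data Dictionary field label, already translated by the backend.

import xml.etree.ElementTree as ET
import requests

# Hypothetical Gateway service following the usual /sap/opu/odata/sap/ path.
URL = "https://gateway.example.com/sap/opu/odata/sap/ZDEMO_SRV/$metadata"

EDM = "http://schemas.microsoft.com/ado/2008/09/edm"   # OData V2 EDM namespace
SAP = "http://www.sap.com/Protocols/SAPData"           # SAP annotation namespace

root = ET.fromstring(requests.get(URL).text)

# A property in the metadata typically looks like:
#   <Property Name="Name1" Type="Edm.String" sap:label="Name 1"/>
for prop in root.iter("{%s}Property" % EDM):
    print(prop.get("Name"), "->", prop.get("{%s}label" % SAP))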

 

E.png

 

 

In my next blog, I will focus on the various deployment options for SAP Netweaver Gateway.


Next Blog: OData - Everything that you need to know (Part 5)

OData - Everything that you need to know (Part 5)


In my previous blog, I introduced you to SAP Netweaver Gateway and how it leverages the OData protocol to liberate SAP business data for consumption on virtually any platform, environment or device.

 

Previous Blogs in the series:

OData - Everything that you need to know (Part 1)

OData - Everything that you need to know (Part 2)

OData - Everything that you need to know (Part 3)

OData - Everything that you need to know (Part 4)

 

In this blog we will see the various deployment options for SAP Netweaver Gateway and their pros and cons.

 

Before getting into the various deployment options with SAP Netweaver Gateway, it is important to understand the different components.

Prior to NW 7.40, SAP Netweaver Gateway consisted of three add-ons, namely GW_CORE, IW_FND and IW_BEP. While the first two components provided the Gateway server functionality, IW_BEP provided the Gateway backend functionality.

From NW 7.40 onwards, all three components are bundled into a single component, SAP_GWFND (Gateway Foundation).

 

There are three possible deployment options to pick from and we will discuss each one of them.

 

 

1. Hub Deployment: Development in the Backend system

  

In this deployment strategy, SAP Netweaver Gateway is installed on a separate SAP system referred to as the Gateway Hub. The OData services are registered and exposed from the Gateway Hub but are developed in the SAP backend system. If the SAP Business Suite backend systems run on a NW release prior to 7.40, the component IW_BEP must be installed there. For systems running on NW 7.40 onwards, the SAP_GWFND component contains both the Gateway server and backend functionality, as mentioned before.

Advantages of this deployment option are as follows:

  • There is only a single point of access to the SAP backend systems. No direct access from the outside world provides enhanced security.
  • The Gateway Hub can run on a newer release (NW 7.31 or NW 7.40) while the backend systems run on a lower release.
    • Gateway on a newer release means support for SAPUI5.
    • Added authentication features are supported.
  • Direct access to metadata (DDIC) and business data for OData modeling, since development takes place in the backend system.

  

1.png

 

 

2. Hub Deployment: Development in the Hub

 

Hub deployment with development in the Gateway Hub is an option where, just like in the previous option, there is a dedicated Gateway Hub separate from the backend system. Since all SAP Netweaver Gateway development takes place in the Gateway Hub, the backend does not need any Gateway components installed. It is a feasible option if you don't want to do any development in the backend system and want to leverage what is already available there.

There are a few disadvantages though:

  • Access to data sources for OData service development is limited to existing remote-enabled interfaces such as BAPIs and RFCs.
  • The Gateway Hub, where the OData services are modeled, has no direct access to the backend dictionary objects; only remote access is possible.
  • A landscape with a dedicated Gateway Hub system involves additional cost compared to the embedded deployment option. This is true for the previous deployment option as well.

  

2.png

 

 

 

3. Embedded Deployment

 

In the embedded deployment option, the SAP Netweaver Gateway components are installed as add-ons on the SAP backend system itself. Both the OData modeling and the exposure of the services are done from the backend system. This deployment strategy saves cost, as there is no dedicated Gateway Hub. Also, the runtime overhead caused by the remote calls in the two hub-based deployment options is avoided.

Disadvantages:

  • If there are multiple SAP Business Suite backend systems in the landscape, each system has to have its own installation and configuration of the Netweaver Gateway components.
  • Upgrades of the backend systems follow a different cycle than those of Netweaver Gateway.
  • Additional security measures are needed, as there is no single access point between the consumers and the backend system.

 

3.png

 

In the next blog, we will explain the case study that will be the basis of our future blogs related to end-to-end steps of producing and consuming OData services.


Next Blog: OData - Everything that you need to know (Part 6)
