Oracle JDE Order-to-Cash app template


Extraction

This app template uses Oracle JD Edwards 8.0 or higher as the source system, referred to in this documentation as Oracle JDE.

Extraction for this source system is done using CData Sync. To set up the extraction, make sure you have a valid CData Sync license and that CData Sync is installed.

System configuration

To retrieve the required data from Oracle JDE, the Sales Management and Accounts Receivable modules are mandatory. Other tables can be used for master data purposes.

To connect via CData Sync, the user establishing the connection must have read permissions on each of the objects being extracted. All fields included in the input tables must be visible to the user extracting the data.

Setting up the source connection in CData Sync

Setting up the destination connection in CData Sync

To set up the destination connection, follow the steps described in Create a destination connection (Automation Cloud). Note: If you are using Automation Suite, make sure to select AUTOMATION SUITE as the DELIVERY option and select the Automation Suite version you are using.

Creating the extraction job in CData Sync

To create the job in CData Sync, follow the steps below.

Advanced job settings

In the Advanced tab in the Job Settings panel, edit the following settings:

If you are using Automation Suite, set the Destination Schema in the Settings panel on the overview tab. Use the schema name you retrieved when you created the destination connection.

Setting up environment variables in the extraction job

Important: Make sure you edit the Pre-job Event.

CData Sync allows the use of environment variables to drive specific extraction logic in each query.

Variable Description Comment
start_extraction_date Defines the first date for which data will be extracted. Mandatory
end_extraction_date Defines the last date for which data will be extracted. Optional; if not set, it defaults to today's date

To set up the environment variables:

Step Action
1 Go to the Events tab in the Job Settings panel of the job you are creating
2 Edit the Pre-Job Event section to add the code displayed below after <!-- Code goes here -->. Do not modify the api:info details that are shown by default.
3 Click on Save Changes.
<!-- Modify environment variables here. -->
<!-- Variables start_extraction_date and end_extraction_date must be populated. -->
<!-- The following values are just an example (start = 01-01-1900 and end = 01-01-2300). -->
<api:set attr="out.env:start_extraction_date" value="000001" />
<api:set attr="out.env:end_extraction_date" value="400001" />
<api:push item="out" />

If not set, end_extraction_date defaults to today's date. start_extraction_date must always be populated. The dates must be formatted in the JDE Julian format, as this is the format JDE uses internally. Use the following to calculate the correct value:

To transform 31/12/2021 into the JDE Julian format (CYYDDD, stored as an integer):
C = the first two digits of the year minus 19 (JDE uses 01-01-1900 as the base for date fields). In our example: C = 20 - 19 = 1.
YY = the last two digits of the year. In our example: YY = 21.
DDD = the day of the year (range 1-365, or 1-366 for leap years). In our example: DDD = 365.
So 31/12/2021 in the JDE Julian format is 121365.
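
For reference, the same calculation expressed as a small Python sketch (this is not part of the app template; the function name is illustrative only):

    from datetime import date

    def to_jde_julian(d: date) -> int:
        # C: first two digits of the year minus 19 (1900 -> 0, 2000 -> 1)
        century = d.year // 100 - 19
        # YY: last two digits of the year
        year = d.year % 100
        # DDD: day of the year (1-365, or 1-366 in leap years)
        day_of_year = d.timetuple().tm_yday
        return century * 100000 + year * 1000 + day_of_year

    print(to_jde_julian(date(2021, 12, 31)))  # 121365
    print(to_jde_julian(date(1900, 1, 1)))    # 1, i.e. 000001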

Setting up the data ingestion call

After the data has been successfully loaded, the Process Mining platform needs to start the data ingestion process. This is done by calling the End of Upload API. To set this up, follow the steps below:

Step Action
1 Go to the Events tab in the Job Settings panel of the job you are creating
2 Edit the Post-Job Event section to add the code displayed below after <!-- Code goes here -->. Do not modify the api:info details that are shown by default.
3 Replace END_OF_UPLOAD_API in the code with the End of Upload API value provided.
4 If you need a debug log for this call, uncomment the two commented lines in the code below and provide a location for the log file.
5 Click on Save Changes.
<api:set attr="http.url" value="END_OF_UPLOAD_API"/> <!-- <api:set attr="http.verbosity" value="5"/> --> <!-- <api:set attr="http.logfile" value="D:/mydir/cdata_log.txt"/> --> <api:call op="httpPost" in="http"/>

Table Replication

Once the job is correctly set up, go to the Task tab, click + Add Tasks, enable the Custom Query option, and paste the queries below (each query must keep the semicolon at the end). Make sure you save all changes.

REPLICATE [F0005] SELECT [DRKY] AS [drky], [DRRT] AS [drrt], [DRSY] AS [drsy], [DRDL01] AS [drdl01] FROM [JDECTL920].[F0005];
REPLICATE [F0006] SELECT [MCMCU] AS [mcmcu], [MCDL01] AS [mcdl01], [MCRP01] AS [mcrp01] FROM [JDEDATA920].[F0006];
REPLICATE [F0010] SELECT [CCCO] AS [ccco], [CCCRCD] AS [cccrcd], [CCNAME] AS [ccname] FROM [JDEDATA920].[F0010];
REPLICATE [F0014] SELECT [PNPTC] AS [pnptc], [PNPTD] AS [pnptd] FROM [JDEDATA920].[F0014];
REPLICATE [F0015] SELECT [CXAN8] AS [cxan8], [CXCRCD] AS [cxcrcd], [CXCRDC] AS [cxcrdc], [CXRTTYP] AS [cxrttyp], [CXEFT] AS [cxeft], [CXCRR] AS [cxcrr] FROM [JDEDATA920].[F0015] WHERE ([CXEFT] >= '{env:start_extraction_date}') AND ([CXEFT] <= '{env:end_extraction_date}');
REPLICATE [F0092] SELECT [ULAN8] AS [ulan8], [ULUSER] AS [uluser] FROM [JDESY920].[F0092];
REPLICATE [F0101] SELECT [ABAN8] AS [aban8], [ABALPH] AS [abalph], [ABMCU] AS [abmcu], [ABAT1] AS [abat1] FROM [JDEDATA920].[F0101];
REPLICATE [F0116] SELECT [ALAN8] AS [alan8], [ALEFTB] AS [aleftb], [ALCTY1] AS [alcty1], [ALCOUN] AS [alcoun], [ALADDS] AS [aladds], [ALCTR] AS [alctr] FROM [JDEDATA920].[F0116];
REPLICATE [F03012] SELECT [AIAN8] AS [aian8], [AICO] AS [aico], [AICPGP] AS [aicpgp], [AIAC11] AS [aiac11] FROM [JDEDATA920].[F03012];
REPLICATE [F4101] SELECT [IMDSC1] AS [imdsc1], [IMLITM] AS [imlitm] FROM [JDEDATA920].[F4101];
REPLICATE [F40203] SELECT [FSDCTO] AS [fsdcto], [FSTRTY] AS [fstrty], [FSSTDS] AS [fsstds], [FSLNTY] AS [fslnty] FROM [JDEDATA920].[F40203];
REPLICATE [F40205] SELECT [LFLNTY] AS [lflnty], [LFLNDS] AS [lflnds] FROM [JDEDATA920].[F40205];
REPLICATE [F03b11] SELECT [RPKCO] AS [rpkco], [RPDOC] AS [rpdoc], [RPDCT] AS [rpdct], [RPDIVJ] AS [rpdivj], [RPDGJ] AS [rpdgj], [RPVDGJ] AS [rpvdgj], [RPPOST] AS [rppost], [RPTORG] AS [rptorg], [RPDICJ] AS [rpdicj], [RPSFX] AS [rpsfx], [RPAAP] AS [rpaap] FROM [JDEDATA920].[F03B11] WHERE ([RPUPMJ] >= '{env:start_extraction_date}') AND ([RPUPMJ] <= '{env:end_extraction_date}');
REPLICATE [F03b14] SELECT [RZPYID] AS [rzpyid], [RZKCO] AS [rzkco], [RZDOC] AS [rzdoc], [RZDCT] AS [rzdct], [RZDCTG] AS [rzdctg], [RZDICJ] AS [rzdicj], [RZUPMJ] AS [rzupmj], [RZTORG] AS [rztorg], [RZRC5] AS [rzrc5], [RZSFX] AS [rzsfx], [RZCKNU] AS [rzcknu] FROM [JDEDATA920].[F03B14] WHERE ([RZUPMJ] >= '{env:start_extraction_date}') AND ([RZUPMJ] <= '{env:end_extraction_date}');
REPLICATE [F42199] SELECT [SLEMCU] AS [slemcu], [SLDOCO] AS [sldoco], [SLKCO] AS [slkco], [SLDCT] AS [sldct], [SLDOC] AS [sldoc], [SLLNID] AS [sllnid], [SLKCOO] AS [slkcoo], [SLDCTO] AS [sldcto], [SLLTTR] AS [sllttr], [SLSOBK] AS [slsobk], [SLTORG] AS [sltorg], [SLUSER] AS [sluser], [SLUPMJ] AS [slupmj], [SLTDAY] AS [sltday], [SLNXTR] AS [slnxtr], [SLAN8] AS [slan8], [SLSHAN] AS [slshan], [SLAEXP] AS [slaexp], [SLFEA] AS [slfea], [SLMCU] AS [slmcu], [SLPRP1] AS [slprp1], [SLPRP2] AS [slprp2], [SLCRCD] AS [slcrcd], [SLIVD] AS [slivd], [SLTRDJ] AS [sltrdj], [SLRSDJ] AS [slrsdj], [SLDRQJ] AS [sldrqj], [SLFRTH] AS [slfrth], [SLLNTY] AS [sllnty], [SLLITM] AS [sllitm], [SLSRP1] AS [slsrp1], [SLSRP2] AS [slsrp2], [SLSRP3] AS [slsrp3], [SLSRP4] AS [slsrp4], [SLSRP5] AS [slsrp5], [SLPTC] AS [slptc], [SLPRP4] AS [slprp4], [SLUOM] AS [sluom], [SLUORG] AS [sluorg], [SLLOCN] AS [sllocn] FROM [JDEDATA920].[F42199] WHERE ([SLUPMJ] >= '{env:start_extraction_date}') AND ([SLUPMJ] <= '{env:end_extraction_date}');

Note: If you need to filter by company, the REPLICATE queries for tables that contain a company field must be changed.

Below is a list of tables and their Company field.

Table Company field
F0010 ccco
F03B11 rpkco
F03B14 rzkco
F42199 slkcoo

Add an IN condition to the WHERE clause and list all companies between parentheses.

Below is an example for the F0010 table:

REPLICATE [F0010] SELECT [CCCO] AS [ccco], [CCCRCD] AS [cccrcd], [CCNAME] AS [ccname] FROM [JDEDATA920].[F0010] WHERE [CCCO] IN ('Company1', 'Company2');

Make sure you Save all changes.


Input fields

The following tables list the fields per input table, their description, the data type to be used when formatting the input, and the filter flag that identifies fields used to filter data.

Input types

Below is an overview of the different field types and their default format settings.

Field type Description
boolean true, false, 1, 0
date CYYDDD
double Decimal separator: . (dot); thousand separator: none
integer Thousand separator: none
text N/A

Notes

Input tables & fields

F0005

Stores user-defined codes and their descriptions.

Field name Description Filter Data type
drsy Product Code. It is a category for grouping different types of codes. text
drrt User Defined Codes. It is the type of code. text
drky User Defined Code. It is the actual code. text
drdl01 Description of the code text

F0006

Stores branch, plant, warehouse, and business unit information, such as the company, description, and category codes assigned to them.

Field name Description Filter Data type
mcmcu Business unit text
mcdl01 Description text
mcrp01 Division text

F0010

Stores company definitions, including number and name, fiscal date pattern, and current period.

Field name Description Filter Data type
ccco Company Optional filter when only a set of companies is required text
cccrcd Currency code from. It is the domestic currency for the company text
ccname Name of the company text

F0014

Stores standard payment terms.

Field name Description Filter Data type
pnptc Payment terms code text
pnptd Description payment terms text

F0015

Stores currency exchange rates.

Field name Description Filter Data type
cxan8 Address number Numeric
cxcrcd Currency code from text
cxcrdc Currency code to text
cxrttyp Currency rate type text
cxeft Date effective time based filter date
cxcrr Currency conversion rate spot rate double

F0092

Stores information about the user.

Field name Description Filter Data type
ulan8 Address number text
uluser User id text

F0101

Stores address book information, including employees, companies, suppliers, and customers.

Field name Description Filter Data type
aban8 Address number text
abalph Name text
abmcu Business unit text
abat1 Search type. It is the category of the Address number (e.g. Customer, employee) text

F0116

Stores addresses, address lines, postal codes, and effective dates for address book numbers.

Field name Description Filter Data type
alan8 Address number text
aleftb Date effective date
alcty1 City text
alcoun County text
aladds State text
alctr Country text

Note: This table must not be filtered by dates, as addresses can have effective dates from decades ago and still be valid today.

F03012

The Customer Master by Line of Business table stores specific information for each customer and company combination.

Field name Description Filter Data type
aian8 Address number text
aico Company text
aicpgp Customer price group text
aiac11 Sales region text

F4101

Stores basic information about each item.

Field name Description Filter Data type
imdsc1 Description text
imlitm 2nd item number text

F40203

Stores information about order activity rules, such as order type, line type, last status, and next status. This table also contains descriptions for last statuses.

Field name Description Filter Data type
fsdcto Order type text
fstrty Status line text
fsstds Description status text
fslnty Line type text

F40205

The Line type file contains information related to line types and their descriptions.

Field name Description Filter Data type
lflnty Line type text
lflnds Description line type text

F03B11

The Customer Ledger is the transactional table for invoices, storing information at item level.

Field name Description Filter Data type
rpdicj Date batch date
rpkco Document company Optional filter when only a set of companies is required text
rpdoc Document text
rpsfx Document pay item text
rpdct Document type text
rpdgj Date for GL date
rpvdgj Void date for GL date
rpaap Amount open double
rppost GL posted code Character
rptorg Transaction originator text

Note: The time-based filter is applied using the rpupmj field. This field is not used in any further logic, which is why it is not part of the input file for F03B11.

F03B14

The Receipts Detail is the transactional payments table that stores information at item level.

Field name Description Filter Data type
rzpyid Payment ID text
rzrc5 File line identifier text
rzdicj Date batch date
rzkco Document company Optional filter when only a set of companies is required text
rzdoc Document text
rzsfx Document pay item text
rzdct Document type text
rzdctg Document type Journal Entry text
rzcknu Receipt number text
rztorg Transaction originator text
rzupmj Date updated time based filter date

F42199

Historical sales orders table that contains information at item level.

Field name Description Filter Data type
slupmj Date updated time based filter date
sldoco Document text
sllnid Line number text
slkcoo Order company Optional filter when only a set of companies is required text
sldcto Order type text
sltday Time of day integer
sllitm 2nd item number text
slan8 Address number text
slshan Address number ship to text
slaexp Amount extended price double
slfea Amount foreign extended price double
slmcu Business unit text
slemcu Business unit header text
slprp1 Commodity class text
slprp2 Commodity sub class text
slcrcd Currency code from text
slivd Date invoice date
sltrdj Date order transaction date
slrsdj Date promised delivery date
sldrqj Date requested date
slkco Document company text
sldct Document type text
sldoc Document text
slfrth Freight handling code text
sllnty Line type text
sllocn Location text
slprp4 Master planning family text
slptc Payment terms code text
slsrp1 Sales catalog section text
slsrp3 Sales category code 3 text
slsrp4 Sales category code 4 text
slsrp5 Sales category code 5 text
sllttr Status code last text
slnxtr Status code next text
slsrp2 Sub-section text
sltorg Transaction originator text
sluom Unit of measure as input text
sluorg Units order quantity integer
slsobk Units quantity backordered held integer
sluser User ID text

Configuring transformations

Seed files

Automation_estimates_raw

This seed file is used to add automation-related properties to each activity, which are used for the automation potential dashboard. For more information, see Simulating Automation Potential.

Field Type Description
Activity Text Display name for the activity
Event_cost Double Cost associated with the activity
Event_processing_time Integer Processing time associated with the activity (in milliseconds)

Due_dates_configuration_raw

This seed file is used to define properties for the due dates. For more information, see Due Dates.

Field Type Description
Due_date Text The name of the due date
Due_date_type Text The Due date type
Fixed_costs Boolean An indication whether costs are fixed or time based
Cost Double Fixed costs: The amount of costs. Variable costs: The amount of costs per Time and Time_type
Time Integer A number indicating the amount of time in case of time-based costs
Time_type Text Type of time period for cost calculations. This can be any of the following values: day, hour, minute, second or millisecond

Setup_activities

Field Type Description Example
Native_activity_name varchar(50) The name of the activity that is generated by the system 520 - Enter Sales Order
Activity varchar(50) The name of the activity that overwrites the Native_activity_name if the analyst chooses to do so Create SO item
Activity_category varchar(50) The type of activity that takes place Change, Set block, Remove block or NULL
Automated_flag varchar(1) 1 or (0 or blank or NULL). If 1 then this activity will be considered automated 1

Setup_order_types

There is no specific object for deliveries in JDE. Instead, the same sales order item ID is used for both purposes. The connector contains automated logic to split events that belong to deliveries from those that belong to sales orders. This split is done by selecting the minimum status of an order from which onward the statuses belong to the Delivery item object. If the automated logic does not find the expected value, you can select that value manually using this seed file.

Field Type Description Example
Order_type varchar(2) Order type code. These are defined on each JDE implementation and they can widely vary SO
Line type varchar(2) Type of line of the order. S
Delivery_first_status varchar(3) Minimum status code for which all activities are considered part of Delivery item object 530
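
To make the split described above concrete, the following Python sketch illustrates the idea in simplified form. It is illustrative only, not the connector's actual transformation logic; the event list and the status codes other than the seed file example value 530 are placeholders:

    # Illustrative only: events at or above Delivery_first_status go to the
    # Delivery item object, earlier events stay on the Sales order item.
    def split_events(events, delivery_first_status):
        # events: list of (last_status_code, activity_name) tuples
        sales_order_events = [e for e in events if int(e[0]) < int(delivery_first_status)]
        delivery_events = [e for e in events if int(e[0]) >= int(delivery_first_status)]
        return sales_order_events, delivery_events

    events = [("520", "Enter Sales Order"), ("540", "Pick"), ("620", "Ship Confirm")]
    so_events, delivery_events = split_events(events, "530")
    # so_events       -> [("520", "Enter Sales Order")]
    # delivery_events -> [("540", "Pick"), ("620", "Ship Confirm")]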

Setup_automated_users

There is not always an automated flag, or an algorithmic way, to determine which activities were performed automatically. In this case, you can manually flag the users that are deemed automated.

The setup_automated_users.csv seed file contains the information of all automated users that need to be flagged manually.

Field Type Description Example
User_ID varchar(10) User ID as stored in transactional tables H_APS-SC

Dbt variables

Variable Type Description
date_format string Configuration for the pm-utils package; it sets the date format.
display_currency string Currency code used for all "value" attributes. All amounts (Values) will be converted to the selected display_currency. It is suggested to select the reporting currency code of the client.
material_group_field string Selection of the most representative category field to populate material_group attribute in Sales order items object. Available values are: 'Master_planning_family', 'Commodity_class', 'Commodity_sub_class', 'Sales_catalog_section','Sub_section', 'Sales_category_code_3', 'Sales_category_code_4', 'Sales_category_code_5'.
start_date Minimum accepted date value (format YYYYMMDD). This variable is used for validation when a timestamp field has blanks in the raw files.
event_end_time Time to add to the event_end attribute when only dates are available (format HHMMSS).
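
In a standard dbt project, such variables are set in the vars section of dbt_project.yml. The snippet below is only an illustration of what that could look like for this template, assuming the default dbt conventions apply; the values shown are placeholders, not defaults:

    # Illustration only; placeholder values, assuming standard dbt vars in dbt_project.yml.
    vars:
      date_format: "YYYYMMDD"
      display_currency: "USD"
      material_group_field: "Master_planning_family"
      start_date: "19000101"
      event_end_time: "235959"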

Design specifications

Objects

Object Transactional tables Master data tables
Sales orders F42199 F0010, F0101, F0005, F0116, F0006, F03012
Sales order items F42199 F0010, F0006, F0005, F0116, F0014, F4101, F0015, F40205
Deliveries F42199 F40203
Delivery items F42199 F40203
Invoices F03b11, F42199
Invoice items F03b11, F42199
Invoice cancellations F03b11, F42199
Accounting documents F03b11
Payments F03b14

Activities

Automatically generated activities based on F42199 table

The last status and its description are not standard and are mostly customizable for each JDE implementation. Therefore, it is not possible to know the complete list of activities in advance. Also, the same status may mean different activities depending on the order type.

Automatically generated activities based on F03b14 table

Activities generated by the concatenation of the verb Process and the document type of the journal entry.

    'Process ' + F03b14_input."Document_type_journal_entry"
    Example:
        Process Cash Receipts

Default Activities

Activities created based on information from the transactional tables. Most of these activities apply to the Accounts Receivable module.

Activity Transactional table Master data table
Enter AR Document F03b11 F0005, F0006, F0101, F0092, Setup_automated_users_input, Setup_activities_input
Void/Reverse AR Document F03b11 F0005, Setup_activities
Process Receipt F03b14 F0005
Void Receipt F03b14 F0005

Note on Activity_order attribute

The last_status_code in JDE is used for tracking the activities that have happened to each order item. These statuses are 3-digit numbers that go from 100 to 999. All statuses greater than 900 represent backorders, line splits, cancellations and purges.

Activity orders are set following different logics according to the module where they were generated.

Sales Management activities

Note that the highest last status a record can have in the F42199 historical table is 999, which means 'Ready to Purge'. Based on that, activity orders for the Accounts Receivable module start at value '1000'.

Accounts Receivable activities

Activity Activity order
Enter AR Document 1000
Process Receipt 1200
Concatenation of 'Process' and < Document type description > 1200
Void Receipt 1400
Void/Reverse AR Document 1500

Design details

Limitations

Known issues

Common problems