Using Temp Tables

jyoung
Posts: 640
Joined: Thu Jan 21, 2016 6:43 am
Location: Oklahoma City, OK USA

Using Temp Tables

Post by jyoung » Thu Aug 09, 2018 12:20 am

I have an issue where I need to collect a large amount of data from different tables and be able to page and sort through it. Unfortunately, joining all the data will not work because of how the data is organized.

The only way I can think of to handle this is through the use of a temp or secondary table. My thought is to collect the data and pump it into the temp table; then I can deal with the sorting and paging of the data set.

I can't find anything in the docs regarding temp tables, so perhaps LANSA does not support them? I am also considering creating a physical table to act like a temp table, which means I have to worry about cleaning it out. My concern there is the performance cost of inserting all that data one record at a time.

That leads me to doing something like an "insert into ... select ...", which is how I would do it in the non-LANSA world.

Curious as to what other folks are doing regarding temp tables.

KevinW
Posts: 28
Joined: Thu May 26, 2016 11:18 am

Re: Using Temp Tables

Post by KevinW » Thu Aug 09, 2018 9:17 am

Hi Joe,

How many rows are you looking at, and how many bytes per row?

The first thing that occurs to me is a working list. It can be sorted.
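
Something like this rough RDML sketch, just to give the idea (the file name ORDHDR and the fields #CUSTNO / #AMOUNT are placeholders I've made up, not anything from your data):

    * Define a sortable working list and fill it from the source table
    DEF_LIST NAME(#WRKLST) FIELDS(#CUSTNO #AMOUNT) TYPE(*WORKING) ENTRYS(9999)
    SELECT FIELDS(#CUSTNO #AMOUNT) FROM_FILE(ORDHDR)
    ADD_ENTRY TO_LIST(#WRKLST)
    ENDSELECT
    * Sort the collected entries before paging through them
    SORT_LIST NAMED(#WRKLST) BY_FIELDS(#AMOUNT)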

Next are the SPACE BIFs.

Anything larger than they can handle may need a temporary file, as you say.

If this is IBM i specific and not web, you could keep an empty version of the file in the default file library, use EXEC_OS400 to CRTDUPOBJ it with DATA(*NO) into the QTEMP library (ignoring errors), then EXEC_OS400 to CLRPFM it, and make sure that QTEMP is ahead of the default file library in the job's library list. This ensures the file is empty to start with. At the end of the job it will be deleted, because QTEMP is deleted when a job on IBM i ends.
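
Roughly like this from RDML (WRKTMP and MYDTALIB are made-up names, and the EXEC_OS400 parameters should be checked against the docs; this is only a sketch):

    * Duplicate the empty model file into QTEMP, ignoring the error if it is already there
    EXEC_OS400 COMMAND('CRTDUPOBJ OBJ(WRKTMP) FROMLIB(MYDTALIB) OBJTYPE(*FILE) TOLIB(QTEMP) DATA(*NO)') IF_ERROR(*NEXT)
    * Clear it so it is empty even if it already existed earlier in the job
    EXEC_OS400 COMMAND('CLRPFM FILE(QTEMP/WRKTMP)')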

If it is not IBM i specific, or it is web, you use the same file but with an additional first key column holding a value that is unique to your "session" / job, and you start off by deleting all rows with that key. All I/O to that file uses that value as the first key. You finish off by deleting all rows with that key.
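
As a sketch (WRKTMP, #SESSID and the data fields are placeholder names for the work file, the session-key field and whatever columns you carry):

    * Start clean: remove any leftover rows for this session/job
    DELETE FROM_FILE(WRKTMP) WITH_KEY(#SESSID)
    * Populate the work file, always writing the session key as the first key field
    INSERT FIELDS(#SESSID #CUSTNO #AMOUNT) TO_FILE(WRKTMP)
    * Read back only this session's rows
    SELECT FIELDS(#CUSTNO #AMOUNT) FROM_FILE(WRKTMP) WITH_KEY(#SESSID)
    ENDSELECT
    * Finish off by deleting this session's rows again
    DELETE FROM_FILE(WRKTMP) WITH_KEY(#SESSID)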

jyoung
Posts: 640
Joined: Thu Jan 21, 2016 6:43 am
Location: Oklahoma City, OK USA

Re: Using Temp Tables

Post by jyoung » Thu Aug 09, 2018 11:21 pm

Hi Kevin,

I ended up doing something like you suggested with a working list.
I was concerned about the paging aspect, since I'm used to doing that in the database, but it turned out easier than expected.
It seems to be working pretty well.
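
For the record, the paging is basically just GET_ENTRY by entry number against the sorted list; something like this (only a sketch, with #FIRST and #LAST as placeholder fields holding the entry numbers of the current page):

    * Sort once, then pull out only the entries for the page being shown
    SORT_LIST NAMED(#WRKLST) BY_FIELDS(#AMOUNT)
    BEGIN_LOOP USING(#ENTNUM) FROM(#FIRST) TO(#LAST)
    GET_ENTRY NUMBER(#ENTNUM) FROM_LIST(#WRKLST)
    * ... add the entry to the list that gets displayed ...
    END_LOOP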

I am not familiar with the SPACE BIFs; I will look into those in case something like this comes up again.

Using CRTDUPOBJ in QTEMP is the path I was heading down, but I think the working list worked out better.

Thanks!

atostaine
Posts: 388
Joined: Wed Jan 20, 2016 7:38 am

Re: Using Temp Tables

Post by atostaine » Fri Aug 10, 2018 12:23 am

CRTDUPOBJ will hang if the source object is in use. We've used CPYF FROMFILE(SOURCE) TOFILE(QTEMP/DEST) CRTFILE(*YES) TORCD(1), then CLRPFM QTEMP/DEST.
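
If you're driving it from RDML, that's roughly (library and file names are placeholders; only a sketch):

    * Copy a single record with CRTFILE(*YES) just to get the file created in QTEMP
    EXEC_OS400 COMMAND('CPYF FROMFILE(MYLIB/SOURCE) TOFILE(QTEMP/DEST) CRTFILE(*YES) TORCD(1)')
    * Then clear it so the work file starts empty
    EXEC_OS400 COMMAND('CLRPFM FILE(QTEMP/DEST)')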

Doing it in a working list is probably faster and keeps it all in LANSA.

Art
Art Tostaine
