Paul Jackson | 20 Feb 00:47 2010

Best way to convert "system timestamp format"

Hello,

What is the current recommended method of converting a "system
timestamp" date (8 bytes) into a regular RPG native timestamp field?
I've seen a bunch of date and time APIs/MI instructions out there, so I
was curious as to the best method to use.  I will be doing this
conversion many hundreds of thousands of times and so would be looking
for the quickest method (assuming there is a big difference between
them).

I've used QWCCVTDT before; should I just stick with that?

Thanks in advance!
--

-- 
This is the RPG programming on the IBM i / System i (RPG400-L) mailing list
To post a message email: RPG400-L@...
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/rpg400-l
or email: RPG400-L-request@...
Before posting, please take a moment to review the archives
at http://archive.midrange.com/rpg400-l.

Simon Coulter | 20 Feb 02:22 2010

Re: Best way to convert "system timestamp format"


On 20/02/2010, at 10:47 AM, Paul Jackson wrote:

> What is the current recommended method of converting a "system
> timestamp" date (8 bytes) into a regular RPG native timestamp field?
> I've seen a bunch of date and time API's/MI instructions out there so
> was curious as to the best method to use.  I will be doing this
> conversion many hundreds of thousands of times and so would be looking
> for the quickest method (assuming there is a big difference between
> them).
>
> I've used QWCCVTDT before, should I just stick with that?

That's the easiest method but is likely slower than the MI version.  
You would be trading ease for speed. The only way to know for sure is  
to perform empirical tests but I suspect the call overhead to the API  
will be measurable over "many hundreds of thousands" of iterations.

If you use the MI built-in _CVTD then you'll need to build appropriate  
DDAT structures for the input and output dates. Not difficult but lots  
of sub-fields to set and offsets to calculate correctly.

Search the archives for additional information on DDAT structures.

Regards,
Simon Coulter.
--------------------------------------------------------------------
    FlyByNight Software         OS/400, i5/OS Technical Specialists

    http://www.flybynight.com.au/

Paul Jackson | 23 Feb 00:12 2010

Re: Best way to convert "system timestamp format"

On Fri, Feb 19, 2010 at 5:22 PM, Simon Coulter
<shc@...> wrote:
>
> On 20/02/2010, at 10:47 AM, Paul Jackson wrote:
>
> > What is the current recommended method of converting a "system
> > timestamp" date (8 bytes) into a regular RPG native timestamp field?
> > I've seen a bunch of date and time API's/MI instructions out there so
> > was curious as to the best method to use.  I will be doing this
> > conversion many hundreds of thousands of times and so would be looking
> > for the quickest method (assuming there is a big difference between
> > them).
> >
> > I've used QWCCVTDT before, should I just stick with that?
>
> That's the easiest method but is likely slower than the MI version.
> You would be trading ease for speed. The only way to know for sure is
> to perform empirical tests but I suspect the call overhead to the API
> will be measurable over "many hundreds of thousands" of iterations.
>
> If you use the MI built-in _CVTD then you'll need to build appropriate
> DDAT structures for the input and output dates. Not difficult but lots
> of sub-fields to set and offsets to calculate correctly.
>
> Search the archives for additional information on DDAT structures.

Thanks Simon, I will search further.
--


Vern Hamberg | 23 Feb 19:11 2010

Re: Best way to convert "system timestamp format"

A lot of different terms are used, it seems, for the same things. So I'm 
wondering about "system timestamp". Are these the same thing?

1. TOD value returned by the MATTOD MI function
2. System clock as discussed in documentation for CVTD MI function
3. *DTS as discussed in documentation for QWCCVTDT API

So far as I can tell, TOD and *DTS are the same, since the range of
dates is the same - back to somewhere in 1928 and up through somewhere
in 2071.
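Those endpoints can be sanity-checked with a quick sketch (Python rather than RPG, purely illustrative). The epoch below is an assumption taken from the documented start of the *DTS range, not something stated in this thread, and the sketch assumes a 52-bit time field in 1-microsecond units:

```python
from datetime import datetime, timedelta

# Assumed *DTS epoch: the documented start of the range (UTC).
DTS_EPOCH = datetime(1928, 8, 23, 12, 3, 6, 314752)

# 52 time bits at 1-microsecond granularity give the clock's full span.
DTS_MAX = DTS_EPOCH + timedelta(microseconds=2**52 - 1)

print(DTS_EPOCH)  # 1928-08-23 12:03:06.314752
print(DTS_MAX)    # 2071-05-10 11:56:53.685247
```

An older 49-bit layout in 8-microsecond units spans the same range, since 2^49 * 8 = 2^52 microseconds.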

Thanks
Vern

Paul Jackson wrote:
> On Fri, Feb 19, 2010 at 5:22 PM, Simon Coulter
> <shc@...> wrote:
>   
>> On 20/02/2010, at 10:47 AM, Paul Jackson wrote:
>>
>>     
>>> What is the current recommended method of converting a "system
>>> timestamp" date (8 bytes) into a regular RPG native timestamp field?
>>> I've seen a bunch of date and time API's/MI instructions out there so
>>> was curious as to the best method to use.  I will be doing this
>>> conversion many hundreds of thousands of times and so would be looking
>>> for the quickest method (assuming there is a big difference between
>>> them).
>>>
>>> I've used QWCCVTDT before, should I just stick with that?

Bruce Vining | 23 Feb 20:53 2010

Re: Best way to convert "system timestamp format"

Those terms all refer to a common format for time. From the Information
Center:

The *Standard Time Format* is defined as a 64-bit (8-byte) unsigned binary
value as follows:
  Offset
  Dec Hex   Field Name             Data Type and Length
  0   0     Standard Time Format   UBin(8)
  0   0     Time                   Bits 0-48
  0   0     Uniqueness bits        Bits 49-63
  8   8     --- End ---

The *time* field is a binary number which can be interpreted as a time value
in units of 8 microseconds. A binary 1 in bit 48 is equal to 8 microseconds.

The *uniqueness bits* field may contain any combination of binary 1s and 0s.
These bits do not provide additional granularity for a time value; they
merely allow unique 64-bit values to be returned, such as when the value of
the *time-of-day (TOD) clock* is materialized.
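Splitting a raw Standard Time Format value into those two fields is plain mask-and-shift arithmetic. A minimal sketch (Python rather than RPG, and not from the original post), using IBM's bit numbering where bit 0 is the most significant:

```python
def split_standard_time_format(value: int) -> tuple[int, int]:
    """Split a 64-bit Standard Time Format value (V5R4 layout).

    IBM numbers bits from the most significant, so the time field
    (bits 0-48, units of 8 microseconds) is the high 49 bits and the
    uniqueness bits (bits 49-63) are the low 15 bits.
    """
    time_units = value >> 15     # high 49 bits: time in 8-microsecond units
    uniqueness = value & 0x7FFF  # low 15 bits: uniqueness only
    return time_units, uniqueness

# A binary 1 in bit 48 (value x'8000') is one time unit, i.e. 8 microseconds:
print(split_standard_time_format(0x8000))  # (1, 0)
```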

A number of MI instructions define fields to contain a binary value which
may represent a time stamp or time interval, or may specify a wait time-out
period. Unless explicitly stated otherwise, the format of the field is
the *Standard
Time Format*.

Bruce Vining | 23 Feb 21:44 2010

Re: Best way to convert "system timestamp format"

I should point out that what I attached was V5R4 documentation. In V6R1 the
standard time format provides time with 52 bits, giving 1 microsecond
granularity. (I really need to stop using the V5R4 Info Center...)
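To make the V6R1 arithmetic concrete, here is a minimal sketch (Python rather than RPG, purely illustrative) of converting a raw 8-byte *DTS value into a timestamp. The epoch is an assumption taken from the documented start of the *DTS range, not from this thread; on the system itself QWCCVTDT or _CVTD does this for you:

```python
from datetime import datetime, timedelta

DTS_EPOCH = datetime(1928, 8, 23, 12, 3, 6, 314752)  # assumed epoch (UTC)

def dts_to_timestamp(dts: bytes) -> datetime:
    """Convert an 8-byte *DTS value to a timestamp (V6R1 layout).

    Bits 0-51 are time in 1-microsecond units; bits 52-63 are
    uniqueness bits and carry no time information.
    """
    value = int.from_bytes(dts, "big")  # *DTS is big-endian
    microseconds = value >> 12          # discard the 12 uniqueness bits
    return DTS_EPOCH + timedelta(microseconds=microseconds)

# Under these assumptions, x'8000000000000000' lands exactly on 2000-01-01:
print(dts_to_timestamp(bytes.fromhex("8000000000000000")))  # 2000-01-01 00:00:00
```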

On Tue, Feb 23, 2010 at 1:53 PM, Bruce Vining <bvining@...> wrote:

> Those terms all refer to a common format for time. From the Information
> Center:
>
>
> The *Standard Time Format* is defined as a 64-bit (8-byte) unsigned binary
> value as follows:
>   Offset
>   Dec Hex   Field Name             Data Type and Length
>   0   0     Standard Time Format   UBin(8)
>   0   0     Time                   Bits 0-48
>   0   0     Uniqueness bits        Bits 49-63
>   8   8     --- End ---
>
> The *time* field is a binary number which can be interpreted as a time
> value in units of 8 microseconds. A binary 1 in bit 48 is equal to 8
> microseconds.
>
> The *uniqueness bits* field may contain any combination of binary 1s and
> 0s. These bits do not provide additional granularity for a time value; they
> merely allow unique 64-bit values to be returned, such as when the value of
> the *time-of-day (TOD) clock* is materialized.

