Re: FTP of large store file issue (329 Views)
Occasional Contributor
ITTekk
Posts: 4
Registered: 03-01-2010
Message 1 of 7 (329 Views)

FTP of large store file issue

Hello, I am having an issue with the FTP of a large (>4 GB) STORE file from an HP 3000 server to a Unix server. We FTP the STORE file from the HP 3000 to the Unix server and then back to the HP 3000. I am setting all the file attributes in the get command, including bytestream. When the file lands on the HP 3000 it looks correct, with the exception of the sector count, which is off by about 80 sectors. Upon restore I get the error 'store/restore was unable to open permanent disk file'.

I do the exact same process with a smaller ~250 MB file and it works correctly. The OS is MPE/iX 7.5 with the latest ARPA and FTP patches. I have even tried pushing the file back from the Unix server, with the same results.

Any suggestions?
Respected Contributor
Stan Sieler
Posts: 321
Registered: 10-16-1996
Message 2 of 7

Re: FTP of large store file issue

First, you can't have a bytestream file on the 3000
that's larger than 2 GB.

Second, can you please post the output of
"listf,2" (or "listfile , 2") for the original (good) and
final (failing) versions of the files?

thanks,

Stan
Occasional Contributor
ITTekk
Posts: 4
Registered: 03-01-2010
Message 3 of 7

Re: FTP of large store file issue

Hello,

I have done further testing of this issue. I created a 2.8 GB backup, figuring I would see whether the problem was the roll to the second backup file. Interestingly, I have the exact same problem even with the single 2.8 GB file. I have transferred the file in both bytestream and binary mode with exactly the same results. I am currently running the FTP again and will provide the results once it completes. A listfile,2 shows that both files are identical; I will provide that output shortly.
Respected Contributor
Stan Sieler
Posts: 321
Registered: 10-16-1996
Message 4 of 7

Re: FTP of large store file issue

I'd be rather surprised ... ok, shocked ...
if you see the incoming FTP as a *BYTESTREAM*
file of more than 2 GB. The maximum
number of records for ANY file is 2**31-1,
or 2147483647. Since the record size
for bytestream files is, by definition, 1,
that's 2 GB (minus one byte).
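
That arithmetic can be checked with a small shell sketch (Unix side; not from the thread, and the file sizes below are illustrative):

```shell
#!/bin/sh
# Sketch of the limit described above: an MPE file holds at most
# 2**31 - 1 records, and a bytestream record is by definition 1 byte,
# so the bytestream ceiling is 2147483647 bytes (2 GB minus one byte).
LIMIT=2147483647

fits_bytestream() {
    # $1 = file size in bytes; prints "yes" if it fits, "no" otherwise
    if [ "$1" -le "$LIMIT" ]; then echo yes; else echo no; fi
}

fits_bytestream 262144000     # a ~250 MB file: yes
fits_bytestream 4294967296    # a >4 GB file: no
```

This is why the ~250 MB file transferred cleanly while the >4 GB (and even the 2.8 GB) file could not land as a bytestream file.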

Now, that said, I suspect that the problem
is (a) you're creating a bytestream file
when uploading to the 3000, and (b) you're
not creating a file with filecode 2501.

Here's the essential info for FTP'ing a
STORE-to-disk file to a 3000:

put localname MPENAME;rec=128,1,f,binary;code=2501

E.g., if the local (Unix/Linux/Windows/Mac) filename is foo.sd, and you want to upload
to "FOO" on MPE:

put foo.sd FOO;rec=128,1,f,binary;code=2501

For some, perhaps most, files you may have
to specify a "disc=" too:

put foo.sd FOO;rec=128,1,f,binary;code=2501;disc=1000

To choose the disc= value, take the size
of the file in bytes, divide by 256, and
add at least one to the result; use
that number as the number of records (disc=).

Unlike Randy Medd's excellent LZW utility,
STORE is very picky...the STORE-to-disk
file must have filecode 2501, and must
be a fixed-record file (there may be other record sizes that work, but STORE
prefers rec=128).
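
That sizing rule can be scripted on the Unix side. A minimal sketch, using a small generated stand-in file; the hostname and logon in the commented ftp invocation are made up, not from the thread:

```shell
#!/bin/sh
# Sketch (Unix side) of the disc= rule above: file size in bytes,
# divided by 256, plus at least one record.
disc_records() {
    size=$(wc -c < "$1")
    echo $(( size / 256 + 1 ))
}

head -c 1000 /dev/zero > demo.sd     # 1000-byte stand-in for a STORE file
echo "disc=$(disc_records demo.sd)"  # 1000 / 256 = 3, plus 1 -> disc=4

# The upload itself would then look like this (hp3k.example.com and the
# logon are hypothetical; not run here):
# ftp -n hp3k.example.com <<EOF
# user MGR.BACKUP
# put demo.sd DEMO;rec=128,1,f,binary;code=2501;disc=$(disc_records demo.sd)
# quit
# EOF
```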

Stan
Respected Contributor
Stan Sieler
Posts: 321
Registered: 10-16-1996
Message 5 of 7

Re: FTP of large store file issue

ITTekk...
you wrote: a listfile,2 shows that both files are identical which i will provide shortly.

You might want to do that soon, before people
lose interest.

If/when you get an answer, or lose interest yourself, don't forget to assign points.

Stan
Occasional Contributor
ITTekk
Posts: 4
Registered: 03-01-2010
Message 6 of 7

Re: FTP of large store file issue

Hello, and thanks for all the input. After working this issue for the past two days, I figured out the problem. The comment on bytestream files is dead on; I changed the transfer to binary. The code was already set to 2501. Afterwards I still had issues. I finally figured out that the only way to get this to work with STORE files is to also set disc=16776959.

I am now happily FTP'ing the files. Now if only I can figure out how to do an mget of the FTP'd files from the FTP server, rather than retrieving them individually, I will be a happy camper.
Respected Contributor
Stan Sieler
Posts: 321
Registered: 10-16-1996
Message 7 of 7

Re: FTP of large store file issue

"how to mget" ...

if you're the only user of FTP (at least
during this time), you could try
editing the BLDPARMS file to specify
the record size and binary and disc=
parameters globally. Then do the MGET
then restore the original BLDPARMS file.

Or, use a session-local BLDPARMS file.

Do: :print bldparms.arpa.sys
and read it for info.
(Note: I have not used BLDPARMS myself)
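
Another way to sidestep MGET entirely, as a sketch: drive the transfer from the Unix side, generating one put per file with the required parameters. The logon, the hostname, and the *.sd naming pattern below are all assumptions, not from this thread:

```shell
#!/bin/sh
# Sketch: generate a one-put-per-file ftp command script on the Unix
# side, applying rec=128,1,f,binary, code=2501, and a computed disc=
# to each STORE file, instead of relying on MGET/BLDPARMS defaults.
make_script() {
    echo "user MGR.BACKUP"              # hypothetical MPE logon
    for f in *.sd; do                   # *.sd naming is an assumption
        mpe=$(basename "$f" .sd | tr '[:lower:]' '[:upper:]')
        size=$(wc -c < "$f")
        echo "put $f $mpe;rec=128,1,f,binary;code=2501;disc=$(( size / 256 + 1 ))"
    done
    echo "quit"
}

# make_script | ftp -n hp3k.example.com   # actual run; host hypothetical
```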

Don't forget to assign points :)

thanks,
Stan