Split 174 Meg file into parts

I didn't see that TCC has a split / cut function, but do any of the plugins have it?

I have to split it into parts, as Skype can't send a file that large...
 
This is not a TCC solution, but you can use the 7-Zip File Manager (7zfm.exe) to split and combine files.

I just split windows.iso (4,600,823,808 bytes) into:

Windows.iso.005 406,519,808 2023-08-01 14:46:01
Windows.iso.004 1,048,576,000 2023-08-01 14:45:54
Windows.iso.003 1,048,576,000 2023-08-01 14:45:37
Windows.iso.002 1,048,576,000 2023-08-01 14:45:20
Windows.iso.001 1,048,576,000 2023-08-01 14:45:04


...and combined them again.

Joe

 
TPIPE does have the Split option, but I think that is only for text files, not binary files.

Is the file you want to split binary or text?

Joe

 
You can use split in WSL.

Code:
v:\> d 1m*
2023-07-10  11:32       1,048,576  1meg.rnd

v:\> wsl cd /v ; split -n 3 /v/1meg.rnd

v:\> d xa?
2023-08-01  15:44         349,525  xaa
2023-08-01  15:44         349,525  xab
2023-08-01  15:44         349,526  xac

That said, split seems rather lame. You can't specify names for the pieces, only the prefix. The default prefix is 'x' and you get xaa, xab, xac, .... And the pieces go to the current WSL directory; hence the cd in the example above. And I have no idea how large a file it can handle.
 
I don't, but it might be a fun project. As Joe asked, are you looking to split binary files or text files? By lines or bytes?
 
Hi all,

Code:
 8/01/2023  13:00     179,837,680  20230801_113528.mp4
 
Hi all,

Code:
 8/01/2023  13:00     179,837,680  20230801_113528.mp4

Maybe a

[c:\path\] split /nPARTS filename.ext or wildcard

would split filename.ext into PARTS parts, where PARTS is a number....
 
I don't, but it might be a fun project. As Joe asked, are you looking to split binary files or text files? By lines or bytes?
This plugin took less than 2 hours to write (and it will probably take a day to polish it, add error control, and test it more than once).

Code:
v:\> help split

SPLIT N (pieces) file_name

v:\> split 5 1meg.rnd

v:\> d 1m*
2023-07-10  11:32       1,048,576  1meg.rnd
2023-08-01  17:32         209,716  1meg.rnd_1
2023-08-01  17:32         209,716  1meg.rnd_2
2023-08-01  17:32         209,716  1meg.rnd_3
2023-08-01  17:32         209,716  1meg.rnd_4
2023-08-01  17:32         209,712  1meg.rnd_5

I thought I had a working plugin template for Visual Studio; I have one but I wouldn't call it working. So I don't have a good mechanism for making ad hoc plugins. Charles, if you want the barebones code that produced the example above, just say so.
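
For anyone following along, the core of such a splitter is short enough to sketch here. This is only an illustration (plain C runtime calls, splitting by piece size rather than by piece count), not the plugin code referred to above:

Code:
/* Illustrative sketch only -- not the plugin's actual source.
   Split fname into pieces of at most piece_size bytes; the last piece
   may be shorter. Pieces are named fname_1, fname_2, ... */
#include <stdio.h>
#include <stdlib.h>

static int split_file(const char *fname, size_t piece_size)
{
    FILE *in = fopen(fname, "rb");
    if (!in)
        return 1;

    char *buf = malloc(piece_size);
    if (!buf) { fclose(in); return 1; }

    int rc = 0;
    unsigned piece = 0;
    size_t got;

    /* Read up to piece_size bytes, write them to the next numbered piece,
       and stop when a read returns 0 bytes. */
    while ((got = fread(buf, 1, piece_size, in)) > 0)
    {
        char outname[1024];
        snprintf(outname, sizeof(outname), "%s_%u", fname, ++piece);

        FILE *out = fopen(outname, "wb");
        if (!out || fwrite(buf, 1, got, out) != got)
            rc = 1;
        if (out)
            fclose(out);
        if (rc)
            break;
    }

    free(buf);
    fclose(in);
    return rc;
}

With a piece size of 209,716, split_file("1meg.rnd", 209716) would produce five files like the listing above: four full pieces and a shorter last one.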
 
You know, @Charles G, specifying the piece size instead of the number of pieces might be better ... because ...
  1. I reckon you're probably wanting to limit the size, and
  2. It'll be easier to write without having to do an awkward computation to get the piece size from the number of pieces
Eh?
 
If it's by piece size - what would the computation be?
 
If it's by piece size - what would the computation be?
For me, with piece_size specified, there is no computation; I don't need to know n_pieces. I'd just read piece_size bytes and write however much was read (which, for the last piece, might be less than piece_size). It'll be done when the reading produces 0 bytes.

When n_pieces is specified, I have to do a bit of a search to get the piece_size.

If YOU want to get n_pieces from piece_size, I think this will do it.

n_pieces = ceil(file_size / piece_size).
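
In C (or a plugin), both directions are a one-line integer computation; the usual trick avoids floating point. Purely illustrative, using the 1meg.rnd numbers from above:

Code:
#include <stdio.h>

/* Integer "ceiling division"; assumes b != 0. */
static unsigned long long ceil_div(unsigned long long a, unsigned long long b)
{
    return (a + b - 1) / b;
}

int main(void)
{
    unsigned long long file_size = 1048576ULL;          /* 1meg.rnd */

    /* n_pieces = ceil(file_size / piece_size) */
    printf("%llu\n", ceil_div(file_size, 209716ULL));   /* 5 */

    /* ...and the other way: piece_size = ceil(file_size / n_pieces) */
    printf("%llu\n", ceil_div(file_size, 5ULL));        /* 209716 */
    return 0;
}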
 
I thought I had a working plugin template for Visual Studio; I have one but I wouldn't call it working. So I don't have a good mechanism for making ad hoc plugins. Charles, if you want the barebones code that produced the example above, just say so.

Thanks for the offer, but it's more fun to do my own....
 
Here's an EXE to try. Get it with the command COPY ftp://vefatica.net/split.zip.

Code:
SPLIT /P|/S N filename

    Split file into pieces without regard for content

    N = number of pieces (/P) or piece size in bytes (/S)

    filesize < 2^32; 2 <= N <= filesize/2

Notes: It won't process input files bigger than 2^32 bytes (4 GB), and it might choke if an output piece is bigger than 1/4 GB (which is its heap reserve). It's easy to get a ton of pieces (/P with a large number or /S with a small number), so be careful. The pieces have suffixes like, for example, 001 ~ 999, using the appropriate number of characters. On NTFS, FindFirstFile/FindNextFile should return them in the correct order, so you should be able to reassemble the original file with the likes of COPY /B piece_* reassembly.
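
If you're wondering how a suffix width like 001 ~ 999 might be chosen, one way (just a sketch; I don't know that SPLIT does exactly this) is to size the field to the digit count of the piece count and zero-pad:

Code:
#include <stdio.h>

/* Sketch only: derive a zero-padded suffix width from the piece count,
   e.g. 7 pieces -> "1".."7", 150 pieces -> "001".."150". */
int main(void)
{
    unsigned n_pieces = 150;

    /* snprintf with a NULL buffer just counts the characters needed. */
    int width = snprintf(NULL, 0, "%u", n_pieces);

    for (unsigned i = 1; i <= n_pieces; i++)
    {
        char suffix[16];
        snprintf(suffix, sizeof(suffix), "%0*u", width, i);
        if (i <= 2 || i == n_pieces)            /* print a few samples */
            printf("piece_%s\n", suffix);
    }
    return 0;
}

Zero-padding is what makes the names sort correctly, so a wildcard reassembly like COPY /B piece_* comes out in the right order.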

Here's a bit of a test.

Code:
v:\> d 1g*
2023-08-02  14:52   1,073,741,824  1gig.rnd

v:\> split /p 5 1gig.rnd

v:\> d 1g*
2023-08-02  14:52   1,073,741,824  1gig.rnd
2023-08-02  14:53     214,748,365  1gig.rnd_1
2023-08-02  14:53     214,748,365  1gig.rnd_2
2023-08-02  14:53     214,748,365  1gig.rnd_3
2023-08-02  14:53     214,748,365  1gig.rnd_4
2023-08-02  14:53     214,748,364  1gig.rnd_5

v:\> copy /b 1gig.rnd_* wholefile.rnd
V:\1gig.rnd_1 => V:\wholefile.rnd
V:\1gig.rnd_2 =>> V:\wholefile.rnd
V:\1gig.rnd_3 =>> V:\wholefile.rnd
V:\1gig.rnd_4 =>> V:\wholefile.rnd
V:\1gig.rnd_5 =>> V:\wholefile.rnd
     5 files copied

v:\> echo %@compare[1gig.rnd,wholefile.rnd]
1
 
Hmmm? I use Firefox. If I right-click and choose Save Link As, I get a 0-byte file (the same as when I just double-click). If I visit the Index of /dl/beta page and double-click on fileutils.zip, it downloads correctly. TCC's COPY also works.
 
Hmmm? I use Firefox. If I right-click and choose Save Link As, I get a 0-byte file (the same as when I just double-click). If I visit the Index of /dl/beta page and double-click on fileutils.zip, it downloads correctly. TCC's COPY also works.

Out of curiosity, what happens if you copy the link address and paste it into the address bar, instead of clicking on it?
 
I get this

Code:
v:\> rndmfile 4gig.rnd %@eval[2**32-1]

v:\> bsplit 4gig.rnd
FileUtils plugin: Can’t get file size for "D:\work\4gig.rnd"

The size is 4,294,967,295.

Then I tried this. I couldn't find any output files. Are there no how_to_split parameters?

Code:
v:\> rndmfile 1meg.rnd %@eval[2**20]

v:\> bsplit 1meg.rnd
   *  FileSize = 1048576
   *  Opened input file 'D:\work\1meg.rnd'.
 
Major goof on my part. I've re-zipped and re-uploaded; would you mind trying again?
 
Major goof on my part. I've re-zipped and re-uploaded; would you mind trying again?
That's better.

Code:
v:\> bsplit /n:2 1meg.rnd
Creating file "v:\1meg.000"
Creating file "v:\1meg.001"

v:\> d 1m*
2023-08-02  16:38         524,288  1meg.000
2023-08-02  16:38         524,288  1meg.001
2023-08-02  16:37       1,048,576  1meg.rnd

Why can't you get the size when it's 2^32-1? GetFileSize() works here.
 
I'm using QueryFileSize(). I should use GetFileSize() instead!
Looks like it should work.

Code:
__int64 WINAPI QueryFileSize( LPCTSTR pszName, int fAllocated );
/*
    Returns the file size for the file pszName, or -1 if it doesn't exist.
    fAllocated = if != 0, return the allocated size.
*/

Interesting, none of my plugins use QueryFileSize().
 
I don't seem to have any files >= 4GB to test with, other than C:\hiberfil.sys which can't be opened. Vincent, what is rndmfile? Is that in one of your plugins?
 
I'm using QueryFileSize(). I should use GetFileSize() instead!
I could make a guess, if QueryFileSize() uses GetFileSize(): when you've asked for the high DWORD and the low 32 bits of the size happen to be 2^32-1 (INVALID_FILE_SIZE), the return value looks like a failure, but it's only a real error if GetLastError() != NO_ERROR as well. It could be QFS() isn't checking that additional condition.
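
For reference, the documented check looks roughly like this (plain Win32; nothing here is specific to the plugin):

Code:
#include <windows.h>

/* Size of an already-open file, or -1 on failure. The GetLastError() test
   distinguishes a real failure from a file whose low 32 size bits just
   happen to be 0xFFFFFFFF (INVALID_FILE_SIZE), e.g. a 4,294,967,295-byte file. */
static __int64 FileSize64(HANDLE hFile)
{
    DWORD high = 0;
    DWORD low = GetFileSize(hFile, &high);

    if (low == INVALID_FILE_SIZE && GetLastError() != NO_ERROR)
        return -1;

    return ((__int64)high << 32) | low;
}

GetFileSizeEx(), which fills in a LARGE_INTEGER, avoids the ambiguity altogether.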
 
I could make a guess, if QueryFileSize() uses GetFileSize(): when you've asked for the high DWORD and the low 32 bits of the size happen to be 2^32-1 (INVALID_FILE_SIZE), the return value looks like a failure, but it's only a real error if GetLastError() != NO_ERROR as well. It could be QFS() isn't checking that additional condition.

Could be. I haven't spent much time with it. One of those convenience functions that you stop using when they cease to be convenient.
 