I got a full backup of my site and its size is 70GB, but my VPS has 100GB max. I don't want the heavy files, since they are just mp3s and videos that I can re-download with torrent, but I do want my script files.
How can I do that?
Big thanks.

tar -xvf site-backup-archive-file.tar(.gz) script-folder-name

That's it.
Highest Regards
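For what it's worth, here is a runnable sketch of the advice above: extract only one directory from a bigger archive. All archive and folder names below are placeholders (not the OP's real paths), and a tiny sample archive is built first so the commands are self-contained; note that GNU tar auto-detects gzip on extraction, so -z is optional with -x.

```shell
# Build a small stand-in for the 70GB backup: a scripts folder plus a heavy media folder.
mkdir -p demo/public_html/scripts demo/public_html/media
echo 'echo hello' > demo/public_html/scripts/run.sh
dd if=/dev/zero of=demo/public_html/media/big.mp3 bs=1k count=4 2>/dev/null
tar -czf site-backup.tar.gz -C demo public_html

# List the archive members first, so you know the exact path to ask tar for:
tar -tzf site-backup.tar.gz

# Extract ONLY the scripts directory. tar still has to scan the whole
# archive to find it, which is why a huge backup can look "stuck" for a while.
mkdir restored
tar -xzvf site-backup.tar.gz -C restored public_html/scripts
```

After this, restored/public_html/scripts exists while the media folder was never written to disk.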
Mohammed H

I entered the command and it seemed like it was working, but I waited some minutes and nothing happened.

That means you screwed up |: OK, nice, very helpful.

No, it's not screwed up, but I assume tar will need to read through the whole archive before it extracts the files.
I have also added -v to the command so you can see which files are being extracted.
Highest Regards
Mohammed H

Quote: Originally Posted by XSLTel
No, it's not screwed up, but I assume tar will need to read through the whole archive before it extracts the files.
I have also added -v to the command so you can see which files are being extracted.
Highest Regards
Mohammed H
Solved, but I'm doing it a different way: I open two SSH sessions,
1. one to extract all the files, and
2. one to free up the unneeded space with the rm -rf command, to delete files faster.
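As an alternative to running rm -rf in a second session, GNU tar can skip the heavy files during extraction with --exclude, so they never touch the disk at all. A minimal sketch, with all paths and patterns as made-up placeholders and a tiny sample archive built first so it runs standalone:

```shell
# Stand-in backup containing scripts plus an mp3 we do not want.
mkdir -p src/site/scripts src/site/media
echo ok > src/site/scripts/app.php
echo x  > src/site/media/song.mp3
tar -czf backup.tar.gz -C src site

# Extract everything EXCEPT the heavy media types; nothing to delete afterwards.
mkdir out
tar -xzf backup.tar.gz -C out \
    --exclude='*.mp3' --exclude='*.avi' --exclude='*.mkv'
```

This avoids the window where the extracted media briefly consumes disk space before rm catches up, which matters on a VPS where the 70GB archive plus its extracted copy would not both fit in 100GB.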