Sitecopy refuses to update a file

I have been using the software “sitecopy” for quite a while now and find it extremely useful, much better than any FTP client; with it, I maintain several websites quite easily. I have encountered the following problem: after I accidentally removed a file on the server, I was unable to upload the same file again; sitecopy threw a 550 error. Eventually I decided to recreate the file directly on the server, which worked. But then I needed to bring sitecopy’s local storage back in sync with the server.

That is not so easy with sitecopy, from my perspective: the man page warns against doing a sync if the web files already exist locally, and I take that warning seriously, because there are files locally which are not supposed to be on the server, and sitecopy might remove those files automatically. I also don’t see an option to sync a single file. So I made a small change to the file (one which has no effect on the website) and wanted to upload it with the command sitecopy -u website (where “website” is the name of the website given in the sitecopy config file). However, now I get the following message:

file was changed on the server - not overwritten with local changes

Now I don’t know how to proceed: I need to change the file in order to improve the performance of the website, but I can’t upload the changes. Does anyone have an idea how to resolve this?
NOTE: sitecopy is old software and, from what I can see, the author has abandoned it. However, it works perfectly, and I don’t want to give it up in order to use an FTP client.
So any help would be appreciated.
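For reference, the site definition in my ~/.sitecopyrc looks roughly like this (server name, username, and paths are placeholders):

    site website
      server www.example.com
      username myuser
      remote /home/myuser/public_html
      local /home/myuser/www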

Did you check the man page (https://linux.die.net/man/1/sitecopy)?

I believe --fetch might update the internal state.

(make a backup first)

You might have to modify that one file locally once more and then run the update again.
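Something like this, assuming the default storage location ~/.sitecopy (“website” is again the site name from the config file):

    # back up sitecopy's stored state first
    cp -a ~/.sitecopy ~/.sitecopy.bak

    # refresh sitecopy's idea of what is on the server
    sitecopy --fetch website

    # running sitecopy with just the site name should then list
    # what it considers changed
    sitecopy website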

  1. Thank you very much for your quick reply!

  2. Yes, I read the man page (and I mentioned that in my post).

  3. I was suspicious of the --fetch option for a good reason, it seems: it fetched the complete list of files on the server, including some files in a folder which are generated automatically by the server and hence are normally not among the files to be uploaded/updated. I did the fetch anyway, and afterwards ran sitecopy -u website, which deleted all those server-generated files, because I don’t have them locally. But it did resolve the problem with the particular file mentioned in my first post. I very much hope that sitecopy will ignore the files it deleted from now on.

If anyone has an idea of how to avoid such an unwanted deletion (I would like to keep those files), I’ll be glad to hear it!
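The only idea I have found myself so far is the exclude directive documented in the man page; I assume something like this in the site definition would keep sitecopy away from that folder (the folder name here is just a placeholder, and I have not tested this yet):

    site website
      # ... server, username, remote, local as before ...
      # keep sitecopy away from the server-generated files
      exclude /generated/*

The man page also lists a nodelete directive, which apparently stops sitecopy from deleting any remote files at all, though that seems rather coarse for my case.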

The best way forward would be to move away from this outdated tool and switch to another one.

Maybe rsync would be best?
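A minimal sketch, assuming SSH access to the server (host, user, and paths are placeholders):

    # preview what would change, without touching the server
    rsync -avz --dry-run ./local-site/ user@www.example.com:/var/www/site/

    # the same transfer for real; --exclude protects the server-generated
    # folder, and omitting --delete means nothing is removed remotely
    rsync -avz --exclude='generated/' ./local-site/ user@www.example.com:/var/www/site/

Unlike sitecopy, rsync keeps no separate local state: it compares the local and remote trees on every run, so a mismatch like “file was changed on the server” cannot build up.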