
SFTP Backups Not Working


Jonathon

Question

Hey everyone,

I just started looking at using the automated backup in Blesta.

I have created and tested the SFTP account to another server, and that works just fine.

The web user can execute mysqldump - I added some debug code and su'd to that user to test the command line it produces, and it worked just fine.

What is happening is that the system gets a gateway timeout on "force offsite backup" or "download backup" via the web.

 

 

When I stop the process in buildDump() I can see the database .sql file, and it is 100% good.

When I let it try to compress the file, this is where it goes wrong.

I get a zero-byte .gz file and it just times out.
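For what it's worth, the compression step can be reproduced outside PHP to rule out the disk and gzip toolchain (the /tmp path and sample content below are just placeholders):

```shell
# Recreate the gzip step by hand; a zero-byte .gz here would point at the
# filesystem or gzip itself rather than PHP's zlib functions.
echo "CREATE TABLE t (id INT);" > /tmp/probe.sql
gzip -9 -c /tmp/probe.sql > /tmp/probe.sql.gz
ls -l /tmp/probe.sql.gz
gunzip -t /tmp/probe.sql.gz && echo "gzip OK"
```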

 

 

If I comment out the .gz compression and just have the function return the uncompressed .sql file, SFTP fails - no additional information is given as to why.

But when I do a "check SFTP connection" in the settings, that comes back as good.

 

Any thoughts or help would be appreciated :)

 

 

Jonathon M

OnehostingPlan.com

 

 

 


15 answers to this question



If I comment out the gzip of the file, I do not get the nginx timeout and I can see the .sql file.

The core backup.php file is otherwise the same.

By letting it skip the gzip step, I can see the mysqldump has completed, but it looks to be hanging in the gzip function.

If I return the .sql file as the file variable, the SFTP upload then fails. But if I do an on-demand download, I can get the .sql file - just not a .gz file.

Hope that helps - just trying to break down the process to see where and why it is failing.

 

Summary, two parts:

1) The .sql-to-.gz compression creates a zero-byte file.
2) SFTP fails with the .sql file, with no other details.
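One way to dig into part 2 is to drive the same upload by hand with an sftp batch file (host, user, and remote path below are placeholders, not the ones Blesta is configured with):

```shell
# Build a one-off batch file exercising the same "put" the backup task performs
echo "test dump" > /tmp/probe.sql
cat > /tmp/sftp-batch.txt <<'EOF'
put /tmp/probe.sql /backups/
bye
EOF
# Run it with -v against the backup host to see exactly where it fails:
#   sftp -v -b /tmp/sftp-batch.txt backupuser@backup.example.com
```

The verbose output usually names the failing stage (auth, chdir, write), which the Blesta UI doesn't surface.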



 

 

So you're editing core files to get it to work on your server? I would use a better webhost.


Check your webserver logs. If the database is too large, you may have to tweak settings to compensate for the time it takes Blesta to execute the command.

You should have the domain set up with its own error log file under Nginx.
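As a sketch, the directives that usually govern this timeout in an nginx + php-fpm setup look like the following (values and the socket path are examples, not recommendations):

```nginx
# Example vhost fragment; raise these if the dump takes longer than the default 60s
location ~ \.php$ {
    fastcgi_pass unix:/run/php-fpm.sock;   # placeholder socket path
    fastcgi_read_timeout 300;
    fastcgi_send_timeout 300;
}
```

PHP's own max_execution_time (in php-fpm / php.ini) can impose the same ceiling, so it may need raising alongside these.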

 

To be honest, it might be better to create a shell script that does the backup and rsyncs it over SSH. I'm looking to implement this on my setup, since I'm having issues with Blesta's automated backups.
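A minimal sketch of that approach; the host, database, user, and paths are placeholders, and it assumes key-based SSH auth is already set up for rsync:

```shell
# Write the backup script: it pipes mysqldump straight into gzip so no
# uncompressed copy ever sits on disk, then rsyncs the result over SSH.
cat > /tmp/nightly-backup.sh <<'EOF'
#!/bin/sh
set -e
STAMP=$(date +%Y-%m-%dT%H%M%SZ)
FILE="/tmp/blesta_${STAMP}.sql.gz"
mysqldump --host=localhost --user=blesta blesta | gzip -9 > "$FILE"
rsync -az -e ssh "$FILE" backupuser@backup.example.com:/backups/
rm -f "$FILE"
EOF
chmod +x /tmp/nightly-backup.sh
```

Cron that nightly and the web server's gateway timeout never enters the picture.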


Hi Paul,

 

If I change the code in /app/model/backup.php:

private function buildDump() {
    $db_info = Configure::get("Database.profile");

    Loader::loadComponents($this, array("SettingsCollection"));
    $temp = $this->SettingsCollection->fetchSystemSetting(null, "temp_dir");
    $temp_dir = (isset($temp['value']) ? $temp['value'] : null);

    $this->Input->setRules($this->getRules());

    $vars = array(
        'temp_dir' => $temp_dir,
        'db_info' => $db_info
    );

    if ($this->Input->validates($vars)) {
        // ISO 8601
        $file = $db_info['database'] . "_" . date("Y-m-d\THis\Z");
        $file_name = $temp_dir . $file . ".sql";

//      $test_command = "mysqldump --host=" . escapeshellarg($db_info['host']) . " --user=" . escapeshellarg($db_info['user']) . " --password=" . escapes$
//      echo $test_command;
//      file_put_contents("mysql_dump.txt", $test_command);

        exec("mysqldump --host=" . escapeshellarg($db_info['host']) . " --user=" . escapeshellarg($db_info['user']) .
            " --password=" . escapeshellarg($db_info['pass']) . " " . escapeshellarg($db_info['database']) . " > " .
            escapeshellarg($file_name));

        // GZip the file if possible
        if (function_exists("gzopen")) {
            $chunk_size = 4096;
            $compress_file_name = $file_name . ".gz";
            // Compress as much as possible
            $gz = gzopen($compress_file_name, "w9");
            $fh = fopen($file_name, 'rb');

            // Read from the original and write in chunks to preserve memory
//          while (!feof($fh)) {
//              $data = fread($fh, $chunk_size);
//              if ($data)
//                  gzwrite($gz, $data);
//          }
//          unset($data);

            // Workaround: read the whole file into memory and compress in one shot
            gzwrite($gz, file_get_contents($file_name));
            fclose($fh);

            $compressed = gzclose($gz);

            // Remove the original data file
            if ($compressed) {
                unlink($file_name);
                return $compress_file_name;
            }
        }

        return $file_name;
    }
}

It does work as expected. I know that, long term, reading the full file into memory without chunking will cause issues.

Not looking to modify the core - just trying to understand what part of the function is failing.

Please note: this may be part of a bug, but the function being checked for is not the same function being used to do the compression.

Yes, I can tar files on the command line with no issue :) even if I su to the www user running the site.

 

Thanks for looking :)

11 minutes ago, Kurogane said:

I have the same issue; I'm using PHP 5.6.

When I click "force offsite backup" it gives me a failure, but if I click "download backup" it gives me a backup without an issue.

Is there any way to run the backup by hand to see why it is failing? The webserver and PHP error logs show nothing.

Does the test-settings check work?


I would enable error reporting in /config/blesta.php (change the error reporting value from "0" to "-1"). Then disable the cron temporarily and run it manually when the next backup should be processed: Settings > System > Automation. It may output more errors. If the issue occurs only when your CLI/cron environment runs it, and not via the web, then you can execute the same cron command via SSH/CLI instead.
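For reference, Blesta's cron entry point is index.php invoked with a cron argument; a sketch that just assembles the command, assuming a placeholder install path of /var/www/blesta:

```shell
BLESTA_DIR=/var/www/blesta   # placeholder: your actual install path
# Run this from an SSH session; with error reporting at -1 any PHP errors
# print straight to the console instead of being swallowed by the cron log.
echo "php $BLESTA_DIR/index.php cron"
```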


After a day, the cron run reports no errors, and the SFTP backup still gives me a zero-byte .gz file.

 

Attempting to run all tasks for Company.
Attempting to renew services and create invoices.
The create invoices task has completed.
Attempting to apply credits to open invoices.
There are no invoices to which credits may be applied.
The apply credits task has completed.
Attempting to auto debit open invoices.
The auto debit invoices task has completed.
Attempting to deliver invoices scheduled for delivery.
No invoices are scheduled to be delivered.
The deliver invoices task has completed.
Attempting to provision paid pending services.
The paid pending services task has completed.
Attempting to suspend past due services.
The suspend services task has completed.
Attempting to unsuspend paid suspended services.
The unsuspend services task has completed.
Attempting to cancel scheduled services.
The cancel scheduled services task has completed.
Attempting to process service changes.
The process service changes task has completed.
Attempting to process renewing services.
The process renewing services task has completed.
Attempting to send payment reminders.
The payment reminders task has completed.
Attempting plugin cron for order accept_paid_orders.
Finished plugin cron for order accept_paid_orders.
Attempting plugin cron for support_manager poll_tickets.
Finished plugin cron for support_manager poll_tickets.
Attempting plugin cron for support_manager close_tickets.
Finished plugin cron for support_manager close_tickets.
Attempting to clean up old logs.
0 old Gateway logs have been deleted.
0 old Module logs have been deleted.
The clean logs task has completed.
All tasks have been completed.
Attempting to run all system tasks.
Attempting to validate the license.
The license validation task has completed.
Attempting to backup the database to AmazonS3.
The backup completed successfully.
The AmazonS3 database backup task has completed.
Attempting to backup the database via SFTP.
The backup completed successfully.
The SFTP database backup task has completed.
All system tasks have been completed.


 
