Bug 978344 - Mysql big data snapshot restore failed
Summary: Mysql big data snapshot restore failed
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Containers
Version: 1.2.0
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Assignee: Jason DeTiberus
QA Contact: libra bugs
URL:
Whiteboard:
Depends On: 970914
Blocks:
 
Reported: 2013-06-26 12:13 UTC by nsun
Modified: 2017-03-08 17:35 UTC
6 users

Fixed In Version: openshift-origin-cartridge-mysql-0.3.7-1.el6op
Doc Type: Bug Fix
Doc Text:
Clone Of: 970914
Environment:
Last Closed: 2013-06-28 15:45:29 UTC
Target Upstream Version:
Embargoed:



Description nsun 2013-06-26 12:13:29 UTC
+++ This bug was initially created as a clone of Bug #970914 +++

Description of problem:
After saving a snapshot of a large data set (500,000 rows) and restoring it, no data is present in MySQL.

Version-Release number of selected component (if applicable):
Devenv_3319

How reproducible:
always

Steps to Reproduce:
1. Create a PHP app and embed the MySQL cartridge into it
# rhc app create php php-5.3 -predhat
# rhc cartridge add mysql-5.1 -a php -predhat
2. SSH into the app and create the table 'info'
# ssh 51aedd014c83d2073f000001.rhcloud.com
> mysql
> use php
> CREATE TABLE IF NOT EXISTS info(id INT NOT NULL AUTO_INCREMENT, datta CHAR(200), PRIMARY KEY (id));
3. Use a script to insert 500,000 rows of data into the table 'info', for example:
i=0
while [ "$i" -lt 500000 ]; do
    mysql -Dphp -e "INSERT INTO info VALUES(NULL, 'This is testing data for testing snapshoting and restoring big data in mysql database.');"
    i=$((i+1))
done
4. Save a snapshot of the app
# rhc snapshot save php -predhat
5. Log in to MySQL from the app's cartridge and drop the 'info' table
# ssh 51aedd014c83d2073f000001.rhcloud.com
> mysql -Dphp -e "drop table info"
6. Restore the app from the snapshot
# rhc snapshot restore php -f php.tar.gz -predhat
7. Log in to MySQL to check whether the data was restored, as in the example below
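For example, a quick row count check (host, database, and table names taken from the earlier steps):
# ssh 51aedd014c83d2073f000001.rhcloud.com
> mysql -Dphp -e "SELECT COUNT(*) FROM info;"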

Actual results:
Table 'info' is not found.

Expected results:
The table restore should succeed, and the 500,000 rows of data should be present in MySQL.

Additional info:
I tried writing 10 rows of data to MySQL; in that case the snapshot save and restore succeeded.

--- Additional comment from Fotios Lindiakos on 2013-06-18 16:28:08 EDT ---

I am trying to reproduce this. The following script creates the data more quickly because it makes only a single mysql call.

ruby -e "1000000.times{puts 'INSERT INTO info VALUES(NULL,\'This is testing data for testing snapshoting and restoring big data in mysql database.\');'}" > /tmp/insert.sql && mysql -D $OPENSHIFT_APP_NAME < /tmp/insert.sql

--- Additional comment from Fotios Lindiakos on 2013-06-18 17:17:03 EDT ---

I've modified the mysql dump to be a little smaller by using --extended-insert, and sped up the restore by turning off autocommit and some data checks. This is working in my manual testing. I will run the extended database tests and then submit a pull request.
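A minimal sketch of that approach (not the actual cartridge hook code; the file path and the use of $OPENSHIFT_APP_NAME are assumptions based on the script in the previous comment):

# Dump: --extended-insert packs many rows into each INSERT statement,
# which makes the dump smaller and faster to replay.
mysqldump --extended-insert $OPENSHIFT_APP_NAME > /tmp/mysql_dump.sql

# Restore: turn off autocommit and the per-row data checks while the dump
# is replayed, then commit once at the end.
{
  echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"
  cat /tmp/mysql_dump.sql
  echo "COMMIT;"
} | mysql -D $OPENSHIFT_APP_NAME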

--- Additional comment from Fotios Lindiakos on 2013-06-18 18:50:03 EDT ---

Testing this PR now: https://github.com/openshift/origin-server/pull/2891

--- Additional comment from Fotios Lindiakos on 2013-06-18 20:15:36 EDT ---

The PR is currently merging; it should be available for testing shortly.

--- Additional comment from openshift-github-bot on 2013-06-18 21:28:00 EDT ---

Commit pushed to master at https://github.com/openshift/origin-server

https://github.com/openshift/origin-server/commit/24fbcce38965191489916c30736927300350eeca
Bug 970914 - Mysql big data snapshot restore failed

--- Additional comment from nsun on 2013-06-19 04:22:00 EDT ---

Verified on devenv_3383.

Comment 1 nsun 2013-06-26 12:15:40 UTC
This bug still exists on OSE.
Puddle: 1.2/2013-06-25.3

Comment 3 Jason DeTiberus 2013-06-26 13:35:19 UTC
https://github.com/openshift/enterprise-server/pull/95

Comment 4 nsun 2013-06-27 06:58:41 UTC
Verified at puddle 1.2/2013-06-26.3.

Comment 5 Luke Meyer 2013-06-28 15:45:29 UTC
Closing all bugs introduced, fixed, and verified during 1.2 release work (thus never shipped).

