Parvis: Take dump from “select *” query.
Chamley: Haris: You can do that, if the structure and your command allow.
Jou: Spexi: yes. order matters.
Capaldo: Structure of the command, meaning?
Mankey: Select will be select * from table
Slowinski: Import will be the same, except that table will have one extra col with default value set
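For example (table and column names here are made up for illustration), the target table just needs the extra column declared with a default:

```sql
-- Hypothetical target table: same columns as the source, plus one new
-- column with a default value. Rows imported from the old dump will
-- get 0 in `new_flag` automatically.
CREATE TABLE orders_new (
    id          INT NOT NULL PRIMARY KEY,
    customer_id INT NOT NULL,
    amount      DECIMAL(10,2) NOT NULL,
    new_flag    TINYINT NOT NULL DEFAULT 0
);
```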
Trexler: Haris: Try it. If you have problems, then ask. It’s tough to list all the ways you can fail.
Emore: Haris: If you fail, you’ll have a very specific testcase to ask about.
Willette: Haris: you could use mysqldump to get the data only with the -t option. and use -c to get the column names in the insert as well. so if you import that dump to the new structure, additional columns in the table will get the default value
Willette: Haris: although I don’t understand why you can’t simply ALTER the existing table and add the new columns you want
Hench: Because of the number of rows
Buchauer: Haris: Certain constraints could prevent the operation. It’s tough to guess what you have.
Jass: The alter table query takes hours and hours and hours
Betak: Don’t have that much in business hours
Hysquierdo: Spexi: read the docs. composite keys are a good concept to grasp
Willette: Haris: what happens to new rows while you import that dump?
Hartvigsen: Does InnoDB lock the table or the rows?
Willette: Haris: for ALTER TABLE statements that use a temp copy it will lock the table. but depending on your MySQL version, ADD COLUMN should be possible online iirc
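On MySQL 5.6+ you can ask for the online behavior explicitly, so the server errors out instead of silently falling back to a locking table copy (table and column names are hypothetical):

```sql
-- Request an in-place, non-locking ADD COLUMN. If the server cannot
-- honor ALGORITHM=INPLACE / LOCK=NONE, the statement fails immediately
-- rather than quietly doing an hours-long table copy.
ALTER TABLE orders
    ADD COLUMN new_flag TINYINT NOT NULL DEFAULT 0,
    ALGORITHM=INPLACE, LOCK=NONE;
```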
Brixner: With the -c option to mysqldump, won’t a 6.5+ million row insert statement be VERY slow to import?
Willette: Oops, I meant -d, sorry
Czyrnik: Haris: See http://hashmysql.org/wiki/InnoDB_Import_Speed for common approaches to speed up imports using the InnoDB engine
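A common recipe along those lines, assuming the dumped data is already known to be consistent (otherwise disabling these checks is unsafe; the file name is a placeholder):

```sql
-- Typical session settings to speed up a bulk InnoDB import.
SET autocommit = 0;          -- batch work into explicit transactions
SET unique_checks = 0;       -- skip secondary unique-index checks
SET foreign_key_checks = 0;  -- skip FK validation during the load

SOURCE dump.sql;             -- replay the dump (hypothetical file)

COMMIT;
SET unique_checks = 1;       -- restore normal checking afterwards
SET foreign_key_checks = 1;
```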
Willette: Haris: -t means “Do not write CREATE TABLE statements that create each dumped table.”, even in 5.1
Willette: Haris: so you want -t to suppress the CREATE TABLE statement
Willette: Haris: and -c does not mean “one insert per row”. it means “Use complete INSERT statements that include column names.”
Willette: Haris: so with -t -c you will get a dump that only contains the data, with batched insert statements that contain the column names. importing that dump to an existing table structure with more columns will result in the additional columns being filled with their default value, unless they are defined as NOT NULL without a default value
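Putting those flags together (database, table, and user names are placeholders):

```shell
# Dump only the data (-t: no CREATE TABLE statements), with column
# names included in each INSERT (-c), so the dump can be replayed into
# a table that has extra columns.
mysqldump -u user -p -t -c mydb mytable > mytable-data.sql

# Each statement in the resulting dump then looks roughly like:
#   INSERT INTO `mytable` (`id`, `name`) VALUES (1,'a'),(2,'b'),...;
```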
Perrota: When I want to find out how many characters are in a certain column? Like when column_test is ‘aaaa’ I want it to return 4. Which function am I looking for here?
Lister: While I’m importing 6.5+ million rows, is that available for use on the server if the table is InnoDB?
Wilshusen: Iaj: That’s CHAR_LENGTH(). The other, LENGTH(), returns the number of bytes, which might not be the same.
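The distinction only shows up with multi-byte character sets:

```sql
-- CHAR_LENGTH() counts characters, LENGTH() counts bytes.
-- With single-byte characters the two agree:
SELECT CHAR_LENGTH('aaaa'), LENGTH('aaaa');  -- 4, 4
-- With multi-byte UTF-8 characters they differ:
SELECT CHAR_LENGTH('äöü'), LENGTH('äöü');    -- 3, 6 (each char is 2 bytes in utf8)
```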
Bridgham: Correction: while I’m importing 6.5+ million rows, is that data available for use from the server, if the table is InnoDB?
Chiappinelli: Haris: You mean, is the data you’ve already processed available to other processes?
Lagos: Haris: Depends on isolation level.
Cestia: Haris: By default, no.
Klink: Haris: Not until committed.
Pozar: And on how you import – if you import blocks of a couple thousand rows and then commit, which is good practice so you don’t clog your rollback area, then the bits that have been committed are available
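A sketch of that pattern in SQL (batch size, table, and column lists are made up):

```sql
-- Import in explicit chunks: each COMMIT releases those rows to other
-- sessions and keeps the rollback/undo area small.
SET autocommit = 0;

INSERT INTO orders (id, amount) VALUES (1, 9.99), (2, 4.50) /* , ... first few thousand rows */;
COMMIT;  -- this chunk is now visible to other sessions

INSERT INTO orders (id, amount) VALUES (5001, 1.25) /* , ... next chunk */;
COMMIT;
```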
Wicker: Right. Not until committed.
Costella: Otoh, if you import with LOAD DATA INFILE, I don’t think any of it is available until it finishes
Stflorant: Haris: But you can change that behavior by using a different isolation level.
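For example, a reporting session that is happy to see half-imported data could switch to READ UNCOMMITTED (InnoDB’s default is REPEATABLE READ; the table name is hypothetical):

```sql
-- Let this session read rows other transactions have not yet committed
-- ("dirty reads"); only use where approximate counts are acceptable.
SET SESSION TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT COUNT(*) FROM orders;  -- sees uncommitted rows from the import
```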
Padgette: Xgc: yeah, but it’s not always obvious to people that you could do intermediate commits during a m*** import :-
Armstead: Trying to work my head around mysql + Excel + Power Query