mariadb / Commits

Commit c8967495, authored Jul 30, 2010 by Georgi Kodinov

merge

parents a79187b8 d765e30a

Showing 4 changed files with 54 additions and 2 deletions (+54 -2)

mysql-test/r/group_by.result   +35  -0
mysql-test/t/group_by.test     +14  -0
sql/item.h                      +1  -0
sql/item_buff.cc                +4  -2

mysql-test/r/group_by.result
@@ -1811,6 +1811,41 @@ MAX(t2.a)
2
DROP TABLE t1, t2;
#
# Bug#55188: GROUP BY, GROUP_CONCAT and TEXT - inconsistent results
#
CREATE TABLE t1 (a text, b varchar(10));
INSERT INTO t1 VALUES (repeat('1', 1300),'one'), (repeat('1', 1300),'two');
EXPLAIN
SELECT SUBSTRING(a,1,10), LENGTH(a), GROUP_CONCAT(b) FROM t1 GROUP BY a;
id 1
select_type SIMPLE
table t1
type ALL
possible_keys NULL
key NULL
key_len NULL
ref NULL
rows 2
Extra Using filesort
SELECT SUBSTRING(a,1,10), LENGTH(a), GROUP_CONCAT(b) FROM t1 GROUP BY a;
SUBSTRING(a,1,10) LENGTH(a) GROUP_CONCAT(b)
1111111111 1300 one,two
EXPLAIN
SELECT SUBSTRING(a,1,10), LENGTH(a) FROM t1 GROUP BY a;
id 1
select_type SIMPLE
table t1
type ALL
possible_keys NULL
key NULL
key_len NULL
ref NULL
rows 2
Extra Using temporary; Using filesort
SELECT SUBSTRING(a,1,10), LENGTH(a) FROM t1 GROUP BY a;
SUBSTRING(a,1,10) LENGTH(a)
1111111111 1300
DROP TABLE t1;
# End of 5.1 tests
#
# Bug#49771: Incorrect MIN (date) when minimum value is 0000-00-00

mysql-test/t/group_by.test
@@ -1222,6 +1222,20 @@ DROP TABLE t1, t2;
--echo #
--echo # Bug#55188: GROUP BY, GROUP_CONCAT and TEXT - inconsistent results
--echo #
CREATE TABLE t1 (a text, b varchar(10));
INSERT INTO t1 VALUES (repeat('1', 1300),'one'), (repeat('1', 1300),'two');
query_vertical EXPLAIN SELECT SUBSTRING(a,1,10), LENGTH(a), GROUP_CONCAT(b) FROM t1 GROUP BY a;
SELECT SUBSTRING(a,1,10), LENGTH(a), GROUP_CONCAT(b) FROM t1 GROUP BY a;
query_vertical EXPLAIN SELECT SUBSTRING(a,1,10), LENGTH(a) FROM t1 GROUP BY a;
SELECT SUBSTRING(a,1,10), LENGTH(a) FROM t1 GROUP BY a;
DROP TABLE t1;
--echo # End of 5.1 tests

sql/item.h
@@ -2982,6 +2982,7 @@ public:
 class Cached_item_str :public Cached_item
 {
   Item *item;
+  uint32 value_max_length;
   String value,tmp_value;
 public:
   Cached_item_str(THD *thd, Item *arg);
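
The item.h hunk adds a single member, value_max_length, which records the truncation bound once instead of re-deriving it from the cache buffer later. A minimal standalone sketch of that idiom (not the server class; every name here other than value_max_length is a placeholder chosen for illustration):

// Sketch only: compute the comparison bound once, at construction time, and
// keep it in a member, instead of re-deriving it later from the cache
// buffer's current capacity.
#include <algorithm>
#include <cstdint>
#include <string>

class CachedStrSketch {
  std::uint32_t value_max_length;  // analogue of the new member in item.h
  std::string value;               // analogue of String value

 public:
  // item_max_length and max_sort_length stand in for arg->max_length and
  // thd->variables.max_sort_length in the real constructor.
  CachedStrSketch(std::uint32_t item_max_length, std::uint32_t max_sort_length)
      : value_max_length(std::min(item_max_length, max_sort_length)) {
    value.reserve(value_max_length);  // roughly what String value(N) did
  }

  // Every later comparison truncates to this fixed bound, so the bound can
  // no longer drift as the buffer grows.
  std::uint32_t bound() const { return value_max_length; }
};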

sql/item_buff.cc
@@ -65,7 +65,9 @@ Cached_item::~Cached_item() {}
 */
 
 Cached_item_str::Cached_item_str(THD *thd, Item *arg)
-  :item(arg), value(min(arg->max_length, thd->variables.max_sort_length))
+  :item(arg),
+   value_max_length(min(arg->max_length, thd->variables.max_sort_length)),
+   value(value_max_length)
 {}
 
 bool Cached_item_str::cmp(void)
@@ -74,7 +76,7 @@ bool Cached_item_str::cmp(void)
   bool tmp;
 
   if ((res=item->val_str(&tmp_value)))
-    res->length(min(res->length(), value.alloced_length()));
+    res->length(min(res->length(), value_max_length));
   if (null_value != item->null_value)
   {
     if ((null_value= item->null_value))
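
For context, a standalone sketch (not server code) of the failure mode this hunk appears to address: when the truncation bound follows the cache buffer's current capacity (the old value.alloced_length() call), the bound can differ from row to row, so two TEXT values with an identical prefix may be cached at different lengths and reported as different groups. Clamping to the fixed value_max_length keeps every comparison at the same length. The capacity numbers used below are invented purely for illustration.

#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// Count "groups" the way a change detector does: a new group starts whenever
// the truncated value differs from the previously cached truncated value
// (roughly the role Cached_item_str::cmp() plays when detecting a change of
// group during GROUP BY evaluation).
static int count_groups(const std::vector<std::string> &rows,
                        const std::vector<std::size_t> &bound_per_row) {
  int groups = 0;
  std::string cached;
  bool have_cached = false;
  for (std::size_t i = 0; i < rows.size(); ++i) {
    std::string prefix = rows[i].substr(0, bound_per_row[i]);
    if (!have_cached || prefix != cached) {
      ++groups;        // value "changed" -> group boundary
      cached = prefix;
      have_cached = true;
    }
  }
  return groups;
}

int main() {
  const std::size_t max_sort_length = 1024;  // MySQL's default max_sort_length
  const std::vector<std::string> rows = {std::string(1300, '1'),   // same data
                                         std::string(1300, '1')};  // as the test

  // Old behaviour (illustrative): the bound tracks the buffer capacity, which
  // may have grown after the first row was cached, so row 2 keeps a longer
  // prefix and no longer compares equal to the cached row 1 prefix.
  std::cout << "drifting bound: " << count_groups(rows, {1024, 1032})
            << " group(s)\n";  // prints 2 (the inconsistent result)

  // New behaviour: one bound, fixed at construction, used for every row.
  std::cout << "fixed bound:    "
            << count_groups(rows, {max_sort_length, max_sort_length})
            << " group(s)\n";  // prints 1, matching GROUP_CONCAT(b) = 'one,two'
  return 0;
}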