The table directive cannot deal with Chinese when the format is csv:

    \[[!table format=csv data="""
    a,b,c
    1,2,你好
    """
    ]]

But the example below works well.

    \[[!table format=csv data="""
    a,b,c
    1,2,3
    """
    ]]


The example below works well too.

    \[[!table format=dsv delimiter=, data="""
    a,b,c
    1,2,你好
    """
    ]]

----

> You don't say what actually happens when you try this, but I hit something similar trying Unicode symbols in a CSV-based table. (I wasn't aware of the DSV work-around. Thanks!) The specific error I get when trying it is

    \[[!table Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.24/Encode.pm line 243.]]

> That file is owned by the `libperl5` package, but I think I've seen an error mentioning `Text::CSV` i.e. `libtext-csv-perl` when I've encountered this before. -- [[Jon]]

>> A related problem, also fixed by using DSV, is messing up the encoding of non-ASCII, non-wide characters, e.g. £ (the workaround was to use the `&pound;` entity instead) -- [[Jon]]

>>> Sorry, I have hit the same error: \[[!table Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.24/Encode.pm line 243.]] -- [[tumashu1]]
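
----

A minimal sketch of the mechanism that probably produces that error (an assumption based on the error message, not something confirmed in this thread): `Encode::decode_utf8` croaks with "Cannot decode string with wide characters" when it is handed a string that has already been decoded into Perl's internal wide-character form, which is what non-ASCII CSV fields such as 你好 or £ would be if they get decoded a second time. If something like that happens on the csv path, it would also explain why the dsv path, which presumably skips that extra decode, works.

    # Minimal reproduction sketch (assumed mechanism, not taken from ikiwiki's code)
    use strict;
    use warnings;
    use utf8;                        # string literals below are already wide characters
    use Encode qw(decode_utf8);

    my $ok    = decode_utf8("1,2,3");          # ASCII bytes decode without trouble
    my $field = "你好";                          # already a decoded, wide-character string
    my $again = eval { decode_utf8($field) };  # decoding it a second time croaks
    print $@ if $@;   # Cannot decode string with wide characters at .../Encode.pm line ...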