I'm developing a part of an application that's responsible for exporting some data into CSV files. The application always uses UTF-8, at all levels, because of its multilingual nature. But opening such CSV files (containing e.g. diacritics, Cyrillic letters, Greek letters) in Excel does not produce the expected result: the characters show up as something like Г„/Г¤, Г–/Г¶. And I don't know how to force Excel to understand that the CSV file it opens is encoded in UTF-8. I also tried specifying a UTF-8 BOM (EF BB BF), but Excel ignores that.

Is there any workaround?

P.S. Which tools might behave like Excel does?


Update

I have to say that I've confused the community with the formulation of the question. When I asked it, I was looking for a way to open a UTF-8 CSV file in Excel without any problems for the user, in a fluent and transparent way. However, I used the wrong wording and asked for it to be done automatically, which is confusing and clashes with VBA macro automation. There are two answers to this question that I appreciate most: the very first one by Alex https://stackoverflow.com/a/6002338/166589, which I accepted, and the one by Mark https://stackoverflow.com/a/6488070/166589, which appeared a little later. From a usability point of view, Excel seems to lack good, user-friendly UTF-8 CSV support, so I consider both answers correct. I accepted Alex's answer first because it plainly states that Excel cannot do this transparently, which is what I had confusingly called "automatically". Mark's answer describes a more complicated way for more advanced users to achieve the expected result. Both answers are great, but Alex's fits my not clearly specified question a little better.


Update 2

Five months after the last edit I noticed that Alex's answer had disappeared for some reason. I really hope it was not a technical issue, and I hope there is no more discussion about which answer is better. So I am now marking Mark's answer as the best one.


Current answer

CSV files generated by PHP had the same problem: Excel ignores the BOM when the separator is defined at the beginning of the content via "sep=,\n" (placed after the BOM, of course).

So adding a BOM ("\xEF\xBB\xBF") at the beginning of the content and setting the semicolon as the separator via fputcsv($fh, $data_array, ";") does the trick.
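
For illustration, a minimal PHP sketch of that approach (the output file name and the sample rows are placeholders of my own, not part of the original answer):

$rows = [["Name", "City"], ["Görel", "Köln"]];   // sample UTF-8 data (placeholder)
$fh = fopen("export.csv", "w");
// Write the UTF-8 BOM first so that Excel can detect the encoding
fwrite($fh, "\xEF\xBB\xBF");
foreach ($rows as $row) {
    // Semicolon as the delimiter instead of a leading "sep=;" line,
    // which would make Excel ignore the BOM
    fputcsv($fh, $row, ";");
}
fclose($fh);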

Other answers

Hi, I am generating CSV from Ruby on Rails. In our application we plan to support multiple languages (I18n), and we ran into a problem when viewing I18n content in the CSV file in Excel on Windows.

Linux (Ubuntu) and Mac were both fine.

We found that Excel on Windows needs the data to be re-imported in order to display it correctly; during the import there are options for choosing the character set.

But that cannot be taught to every single user, so the solution we were looking for was opening the file with a plain double click.

Then, with the help of the aghuddleston gist, we identified a way to display the data correctly in Excel on Windows, using the open mode and a BOM. Adding it here for reference.

Example I18n content

On Mac and Linux

Swedish: Förnamn Chinese: 名字

On Windows

The same strings are rendered as mojibake (e.g. Förnamn shows up as FÃ¶rnamn).

require 'csv'  # needed for CSV.generate

def user_information_report(report_file_path, user_id)
  user = User.find(user_id)
  I18n.locale = user.current_lang
  # External encoding UTF-16LE, internal encoding UTF-8
  open_mode = "w+:UTF-16LE:UTF-8"
  # UTF-8 BOM; it is transcoded to the UTF-16LE BOM (FF FE) when written to the file
  bom = "\xEF\xBB\xBF"
  body user, report_file_path, open_mode, bom
end

def headers
  [
    "ID", "SDN ID",
    I18n.t('sys_first_name'), I18n.t('sys_last_name'), I18n.t('sys_dob'),
    I18n.t('sys_gender'), I18n.t('sys_email'), I18n.t('sys_address'),
    I18n.t('sys_city'), I18n.t('sys_state'), I18n.t('sys_zip'),
    I18n.t('sys_phone_number')
  ]
end

def body(tenant, report_file_path, open_mode, bom)
  File.open(report_file_path, open_mode) do |f|
    # Tab-separated content works reliably with UTF-16LE in Excel on Windows
    csv_file = CSV.generate(col_sep: "\t") do |csv|
      csv << headers
      tenant.patients.find_each(batch_size: 10) do |patient|
        csv << [
          patient.id, patient.patientid,
          patient.first_name, patient.last_name, patient.dob.to_s,
          translate_gender(patient.gender), patient.email,
          "#{patient.address_1} #{patient.address_2}",
          patient.city, patient.state, patient.zip,
          patient.phone_number
        ]
      end
    end
    # The BOM must be written before the CSV content
    f.write bom
    f.write(csv_file)
  end
end

The important things to note here are the open mode and the BOM:

open_mode = "w+:UTF-16LE:UTF-8"

bom = "\xEF\xBB\xBF"

Write the BOM before writing the CSV content. Because the file is opened with UTF-16LE as the external encoding, the UTF-8 BOM string is transcoded to the UTF-16LE BOM (FF FE) on write:

f.write bom

f.write(csv_file)

Windows and Mac

The file can be opened directly with a double click.

Linux (Ubuntu)

When opening the file, it asks for the separator options -> choose "TAB".

First save the Excel spreadsheet as Unicode text. Open the TXT file in Internet Explorer, click "Save as", and choose an appropriate encoding for the TXT file, e.g. Win Cyrillic 1251.

We used the following approach:

Convert the CSV to UTF-16 LE
Insert the BOM at the beginning of the file
Use tab as the field separator
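
A minimal sketch of that approach in PHP (the output file name, the sample rows, and the use of mb_convert_encoding are my own assumptions, not taken from the answer):

$rows = [["ID", "Namn"], [1, "Förnamn"]];   // sample UTF-8 data (placeholder)
$tsv = "";
foreach ($rows as $row) {
    $tsv .= implode("\t", $row) . "\r\n";   // tab as the field separator
}
// Prepend the UTF-16LE BOM (FF FE) and convert the content from UTF-8 to UTF-16LE
$out = "\xFF\xFE" . mb_convert_encoding($tsv, "UTF-16LE", "UTF-8");
file_put_contents("export.csv", $out);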

I tried everything I could find in this thread, and in similar ones, and nothing worked completely. However, importing into Google Sheets and simply downloading as CSV worked like a charm. Give it a try if you have reached my level of frustration.