Getting binary data using SqlDataReader

Date: 2011-03-20 20:15:26

Tags: c# ado.net

I have a table named Blob (Id (int), Data (Image)). I need to fetch the image data using a SqlDataReader. Note that I don't want to Response.BinaryWrite() the data to the browser; I just need the binary data as a byte[] for internal use. The only approach I can think of is to fetch the Id with a SqlDataReader and then issue a second SqlCommand.ExecuteScalar() to get the byte[] for that Id. Can I get the image data as a byte[] using just the SqlDataReader (SqlCommand.ExecuteReader)? Am I missing something?

7 Answers:

Answer 0 (score: 63)

You should be able to get it with: (byte[])reader["Data"]

Also note that the image data type is deprecated and will be removed in a future version of SQL Server; use varbinary(max) instead.
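A minimal sketch of that cast in context (the connection string is a placeholder; checking for DBNull before casting is advisable):

```csharp
using System;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT Id, Data FROM Blob", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            int id = reader.GetInt32(0);
            // Cast the indexer result; guard against NULL values first.
            byte[] data = reader.IsDBNull(1) ? null : (byte[])reader["Data"];
        }
    }
}
```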

Answer 1 (score: 18)

Yes, you can use SqlDataReader.GetBytes. You may want to pass null for the buffer on the first call to find out how much data there is, then call it again with an appropriately sized buffer.

You may also be able to just use the indexer and cast the result to a byte array - I'm not sure. Worth a try :)
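A minimal sketch of that two-call pattern (assuming the reader is positioned on a row and the binary column is at ordinal 1):

```csharp
// First call with a null buffer: GetBytes returns the total field length.
long size = reader.GetBytes(1, 0, null, 0, 0);

// Second call with a buffer of exactly that size reads all the data at once.
// (This rewind to position 0 works with the default CommandBehavior,
// but not with CommandBehavior.SequentialAccess.)
byte[] data = new byte[size];
reader.GetBytes(1, 0, data, 0, (int)size);
```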

Answer 2 (score: 10)

In .NET Framework 4.5 you can use the GetStream method to access the binary data as a Stream.
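A short sketch of that approach (the column ordinal and output file name are assumptions; CommandBehavior.SequentialAccess lets the data stream instead of buffering the whole row):

```csharp
using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (reader.Read())
    {
        // GetStream exposes the column as a Stream without materializing a byte[].
        using (Stream blob = reader.GetStream(1))
        using (FileStream file = File.Create("blob.bin"))
        {
            blob.CopyTo(file);
        }
    }
}
```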

Answer 3 (score: 7)

From MSDN. Not sure why I couldn't find it before.

    SqlConnection pubsConn = new SqlConnection("Data Source=localhost;Integrated Security=SSPI;Initial Catalog=pubs;");
    SqlCommand logoCMD = new SqlCommand("SELECT pub_id, logo FROM pub_info", pubsConn);

    FileStream fs;                          // Writes the BLOB to a file (*.bmp).
    BinaryWriter bw;                        // Streams the BLOB to the FileStream object.

    int bufferSize = 100;                   // Size of the BLOB buffer.
    byte[] outbyte = new byte[bufferSize];  // The BLOB byte[] buffer to be filled by GetBytes.
    long retval;                            // The bytes returned from GetBytes.
    long startIndex = 0;                    // The starting position in the BLOB output.

    string pub_id = "";                     // The publisher id to use in the file name.

    // Open the connection and read data into the DataReader.
    pubsConn.Open();
    SqlDataReader myReader = logoCMD.ExecuteReader(CommandBehavior.SequentialAccess);

    while (myReader.Read())
    {
      // Get the publisher id, which must occur before getting the logo.
      pub_id = myReader.GetString(0);  

      // Create a file to hold the output.
      fs = new FileStream("logo" + pub_id + ".bmp", FileMode.OpenOrCreate, FileAccess.Write);
      bw = new BinaryWriter(fs);

      // Reset the starting byte for the new BLOB.
      startIndex = 0;

      // Read the bytes into outbyte[] and retain the number of bytes returned.
      retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);

      // Continue reading and writing while there are bytes beyond the size of the buffer.
      while (retval == bufferSize)
      {
        bw.Write(outbyte);
        bw.Flush();

        // Reposition the start index to the end of the last buffer and fill the buffer.
        startIndex += bufferSize;
        retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
      }

      // Write the remaining partial buffer (empty when the BLOB length
      // is an exact multiple of bufferSize).
      if (retval > 0)
          bw.Write(outbyte, 0, (int)retval); // the original MSDN sample wrote retval - 1 bytes here, a bug
      bw.Flush();

      // Close the output file.
      bw.Close();
      fs.Close();
    }

    // Close the reader and the connection.
    myReader.Close();
    pubsConn.Close();

Answer 4 (score: 2)

Use this function to read the bytes safely and flexibly:

    public static byte[] ReadBytes(this SqlDataReader reader, int ordinal)
    {
        byte[] result = null;
        if (!reader.IsDBNull(ordinal))
        {
            // First call with a null buffer returns the total field length.
            long size = reader.GetBytes(ordinal, 0, null, 0, 0);
            result = new byte[size];
            int bufferSize = 1024;
            long bytesRead = 0;
            int curPos = 0;
            // Read directly into the result array, one small chunk at a time.
            while (bytesRead < size)
            {
                bytesRead += reader.GetBytes(ordinal, curPos, result, curPos, bufferSize);
                curPos += bufferSize;
            }
        }
        return result;
    }

Answer 5 (score: 1)

There's no need to use a reader. Just fetch the value from the database into a DataSet (using a stored proc or any other method), cast the byte data (code below), and store it in a byte array. Your job is done.

byte[] productImage;
productImage = (byte[])ds.Tables[0].Rows[0]["Image"];

Answer 6 (score: 0)

This is an old question, and I had been using Anton Bakulev's answer above for quite a while, until I ran into a situation where my data was actually larger than the 2 GB that an int curPos can address. When I tried changing the bufferIndex parameter to 0, everything beyond bufferSize came back corrupted. (Also, that tiny buffer size made loading anything over 2 MB painful.)

No, you probably shouldn't have more than 2 GB of data in a single column in your database. Try to avoid that. But just in case, here is a more robust and leaner version of the code, as a SqlDataReader extension method:

public static byte[] ParseStrictByteArray(this SqlDataReader reader, string columnName)
{
    int colIdx = reader.GetOrdinal(columnName);
    long size = reader.GetBytes(colIdx, 0, null, 0, 0);
    byte[] imageValue = new byte[size];
    // Essentially, we are loading all this data into memory either way...
    // Might as well do it in one swoop if we can.
    int bufferSize = (int)Math.Min(int.MaxValue, size);
    // int.MaxValue = 2,147,483,647, roughly 2 GB; if the data is larger we have to read in chunks.
            
    if(size > bufferSize){

        long bytesRead = 0;
        long position = 0; // long, not int: positions past 2 GB would overflow an int
        //we need to copy the data over, which means we DON'T want a full copy of all the data in memory. 
        //We need to reduce the buffer size (but not too much, as multiple calls to the reader also affect performance a lot)
        bufferSize = 104857600; //this is roughly 100MB
        byte[] buffer = new byte[bufferSize];
        while (bytesRead < size)
        {
            if (size - bytesRead < bufferSize)
                bufferSize = Convert.ToInt32(size - bytesRead);

            bytesRead += reader.GetBytes(colIdx, position, buffer, 0, bufferSize);
            //shift the buffer into the final array
            Array.Copy(buffer, 0, imageValue, position, bufferSize);
            position += bufferSize;
        }
    }
    else 
    {
        //single read into the image buffer
        reader.GetBytes(colIdx, 0, imageValue, 0, bufferSize);
    }

    return imageValue;
}
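Usage is then a one-liner per row (the column name Data comes from the question's table):

```csharp
while (reader.Read())
{
    byte[] data = reader.ParseStrictByteArray("Data");
    // ... use the bytes internally
}
```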