Drawing two (or more) waveforms in Java with AudioWaveformCreator

Date: 2019-01-24 14:35:15

Tags: java audio wav wave

This question concerns the code from the answers in this thread. I am using the code posted by Nicholas DiPiazza and, later, the version posted by Andrew Thompson. I added a second AudioWaveformCreator to this code, and both AWCs produce the same result; I don't know why. What I want to do is display two different waveforms (from two different files) in a single JOptionPane.

import java.awt.BasicStroke;
import java.awt.Color;
import java.awt.Font;
import java.awt.Graphics2D;
import java.awt.font.FontRenderContext;
import java.awt.font.LineBreakMeasurer;
import java.awt.font.TextAttribute;
import java.awt.font.TextLayout;
import java.awt.geom.Line2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.text.AttributedCharacterIterator;
import java.text.AttributedString;
import java.util.Vector;

import javax.imageio.ImageIO;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.TargetDataLine;
import javax.sound.sampled.UnsupportedAudioFileException;
import javax.swing.ImageIcon;
import javax.swing.JLabel;
import javax.swing.JOptionPane;


public class AudioWaveformCreator2 {
    AudioInputStream audioInputStream;
    Vector<Line2D.Double> lines = new Vector<Line2D.Double>();
    String errStr;
    Capture capture = new Capture();
    double duration, seconds;
    //File file;
    String fileName = "out.png";
    SamplingGraph samplingGraph;
    String waveformFilename;
    Color imageBackgroundColor = new Color(20,20,20);
    Object result = null;

    public AudioWaveformCreator2(File url, String waveformFilename) throws Exception {
        if (url != null) {
            try {
                errStr = null;
                audioInputStream = AudioSystem.getAudioInputStream(url);
                long milliseconds = (long)((audioInputStream.getFrameLength() * 1000) / audioInputStream.getFormat().getFrameRate());
                duration = milliseconds / 1000.0;
                samplingGraph = new SamplingGraph();
                samplingGraph.createWaveForm(null);                
            } catch (Exception ex) { 
                reportStatus(ex.toString());
                throw ex;
            }
        } else {
            reportStatus("Audio file required.");
        }
    }
    /**
     * Render a WaveForm.
     */
    class SamplingGraph implements Runnable {

        private Thread thread;
        private Font font10 = new Font("serif", Font.PLAIN, 10);
        private Font font12 = new Font("serif", Font.PLAIN, 12);
        Color jfcBlue = new Color(000, 000, 255);
        Color pink = new Color(255, 175, 175);


        public SamplingGraph() {
        }


        public void createWaveForm(byte[] audioBytes) {

            lines.removeAllElements();  // clear the old vector

            AudioFormat format = audioInputStream.getFormat();
            if (audioBytes == null) {
                try {
                    audioBytes = new byte[
                        (int) (audioInputStream.getFrameLength() 
                        * format.getFrameSize())];
                    audioInputStream.read(audioBytes);
                } catch (Exception ex) { 
                    reportStatus(ex.getMessage());
                    return; 
                }
            }
            int w = 500;
            int h = 200;
            int[] audioData = null;
            if (format.getSampleSizeInBits() == 16) {
                 int nlengthInSamples = audioBytes.length / 2;
                 audioData = new int[nlengthInSamples];
                 if (format.isBigEndian()) {
                    for (int i = 0; i < nlengthInSamples; i++) {
                         /* First byte is MSB (high order) */
                         int MSB = (int) audioBytes[2*i];
                         /* Second byte is LSB (low order) */
                         int LSB = (int) audioBytes[2*i+1];
                         audioData[i] = MSB << 8 | (255 & LSB);
                     }
                 } else {
                     for (int i = 0; i < nlengthInSamples; i++) {
                         /* First byte is LSB (low order) */
                         int LSB = (int) audioBytes[2*i];
                         /* Second byte is MSB (high order) */
                         int MSB = (int) audioBytes[2*i+1];
                         audioData[i] = MSB << 8 | (255 & LSB);
                     }
                 }
             } else if (format.getSampleSizeInBits() == 8) {
                 int nlengthInSamples = audioBytes.length;
                 audioData = new int[nlengthInSamples];
                 if (format.getEncoding().toString().startsWith("PCM_SIGN")) {
                     for (int i = 0; i < audioBytes.length; i++) {
                         audioData[i] = audioBytes[i];
                     }
                 } else {
                     for (int i = 0; i < audioBytes.length; i++) {
                         audioData[i] = audioBytes[i] - 128;
                     }
                 }
            }

            int frames_per_pixel = audioBytes.length / format.getFrameSize()/w;
            byte my_byte = 0;
            double y_last = 0;
            int numChannels = format.getChannels();
            for (double x = 0; x < w && audioData != null; x++) {
                int idx = (int) (frames_per_pixel * numChannels * x);
                if (format.getSampleSizeInBits() == 8) {
                     my_byte = (byte) audioData[idx];
                } else {
                     my_byte = (byte) (128 * audioData[idx] / 32768 );
                }
                double y_new = (double) (h * (128 - my_byte) / 256);
                lines.add(new Line2D.Double(x, y_last, x, y_new));
                y_last = y_new;
            }
            saveToFile();
        }


        public void saveToFile() {            
            int w = 500;
            int h = 200;
            int INFOPAD = 0;

            BufferedImage bufferedImage = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
            Graphics2D g2 = bufferedImage.createGraphics();

            createSampleOnGraphicsContext(w, h, INFOPAD, g2);            
            g2.dispose();
            // Write generated image to a file
            try {
                // Save as PNG
                File file = new File(fileName);
                System.out.println(file.getAbsolutePath());
                ImageIO.write(bufferedImage, "png", file);
                result =  new ImageIcon(fileName);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }


        private void createSampleOnGraphicsContext(int w, int h, int INFOPAD, Graphics2D g2) {            
            g2.setBackground(imageBackgroundColor);
            g2.clearRect(0, 0, w, h);
            g2.setColor(Color.white);
            g2.fillRect(0, h-INFOPAD, w, INFOPAD);

            if (errStr != null) {
                g2.setColor(jfcBlue);
                g2.setFont(new Font("serif", Font.BOLD, 18));
                g2.drawString("ERROR", 5, 20);
                AttributedString as = new AttributedString(errStr);
                as.addAttribute(TextAttribute.FONT, font12, 0, errStr.length());
                AttributedCharacterIterator aci = as.getIterator();
                FontRenderContext frc = g2.getFontRenderContext();
                LineBreakMeasurer lbm = new LineBreakMeasurer(aci, frc);
                float x = 5, y = 25;
                lbm.setPosition(0);
                while (lbm.getPosition() < errStr.length()) {
                    TextLayout tl = lbm.nextLayout(w-x-5);
                    if (!tl.isLeftToRight()) {
                        x = w - tl.getAdvance();
                    }
                    tl.draw(g2, x, y += tl.getAscent());
                    y += tl.getDescent() + tl.getLeading();
                }
            } else if (capture.thread != null) {
                g2.setColor(Color.black);
                g2.setFont(font12);
                //g2.drawString("Length: " + String.valueOf(seconds), 3, h-4);
            } else {
                g2.setColor(Color.black);
                g2.setFont(font12);
                //g2.drawString("File: " + fileName + "  Length: " + String.valueOf(duration) + "  Position: " + String.valueOf(seconds), 3, h-4);

                if (audioInputStream != null) {
                    // .. render sampling graph ..
                    g2.setColor(jfcBlue);
                    for (int i = 1; i < lines.size(); i++) {
                        g2.draw((Line2D) lines.get(i));
                    }

                    // .. draw current position ..
                    if (seconds != 0) {
                        double loc = seconds/duration*w;
                        g2.setColor(pink);
                        g2.setStroke(new BasicStroke(3));
                        g2.draw(new Line2D.Double(loc, 0, loc, h-INFOPAD-2));
                    }
                }
            }
        }

        public void start() {
            thread = new Thread(this);
            thread.setName("SamplingGraph");
            thread.start();
            seconds = 0;
        }

        public void stop() {
            if (thread != null) {
                thread.interrupt();
            }
            thread = null;
        }

        public void run() {
            seconds = 0;
            while (thread != null) {
                if ( (capture.line != null) && (capture.line.isActive()) ) {
                    long milliseconds = (long)(capture.line.getMicrosecondPosition() / 1000);
                    seconds =  milliseconds / 1000.0;
                }
                try { Thread.sleep(100); } catch (Exception e) { break; }
                while ((capture.line != null && !capture.line.isActive())) 
                {
                    try { Thread.sleep(10); } catch (Exception e) { break; }
                }
            }
            seconds = 0;
        }
    } // End class SamplingGraph

    /** 
     * Reads data from the input channel and writes to the output stream
     */
    class Capture implements Runnable {

        TargetDataLine line;
        Thread thread;

        public void start() {
            errStr = null;
            thread = new Thread(this);
            thread.setName("Capture");
            thread.start();
        }

        public void stop() {
            thread = null;
        }

        private void shutDown(String message) {
            if ((errStr = message) != null && thread != null) {
                thread = null;
                samplingGraph.stop();                
                System.err.println(errStr);
            }
        }

        public void run() {

            duration = 0;
            audioInputStream = null;

            // define the required attributes for our line, 
            // and make sure a compatible line is supported.

            AudioFormat format = audioInputStream.getFormat();
            DataLine.Info info = new DataLine.Info(TargetDataLine.class, 
                format);

            if (!AudioSystem.isLineSupported(info)) {
                shutDown("Line matching " + info + " not supported.");
                return;
            }

            // get and open the target data line for capture.

            try {
                line = (TargetDataLine) AudioSystem.getLine(info);
                line.open(format, line.getBufferSize());
            } catch (LineUnavailableException ex) { 
                shutDown("Unable to open the line: " + ex);
                return;
            } catch (SecurityException ex) { 
                shutDown(ex.toString());
                //JavaSound.showInfoDialog();
                return;
            } catch (Exception ex) { 
                shutDown(ex.toString());
                return;
            }

            // play back the captured audio data
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            int frameSizeInBytes = format.getFrameSize();
            int bufferLengthInFrames = line.getBufferSize() / 8;
            int bufferLengthInBytes = bufferLengthInFrames * frameSizeInBytes;
            byte[] data = new byte[bufferLengthInBytes];
            int numBytesRead;

            line.start();

            while (thread != null) {
                if((numBytesRead = line.read(data, 0, bufferLengthInBytes)) == -1) {
                    break;
                }
                out.write(data, 0, numBytesRead);
            }

            // we reached the end of the stream.  stop and close the line.
            line.stop();
            line.close();
            line = null;

            // stop and close the output stream
            try {
                out.flush();
                out.close();
            } catch (IOException ex) {
                ex.printStackTrace();
            }

            // load bytes into the audio input stream for playback

            byte audioBytes[] = out.toByteArray();
            ByteArrayInputStream bais = new ByteArrayInputStream(audioBytes);
            audioInputStream = new AudioInputStream(bais, format, audioBytes.length / frameSizeInBytes);

            long milliseconds = (long)((audioInputStream.getFrameLength() * 1000) / format.getFrameRate());
            duration = milliseconds / 1000.0;

            try {
                audioInputStream.reset();
            } catch (Exception ex) { 
                ex.printStackTrace(); 
                return;
            }

            samplingGraph.createWaveForm(audioBytes);
        }
    } // End class Capture    

    public static void main(String [] args) throws Exception {

        AudioWaveformCreator2 awc = new AudioWaveformCreator2(new File("E:/PRODI ILKOM/Semester VIII/TA/wave/cars062.wav"), "cars062.png");
        AudioWaveformCreator2 awc2 = new AudioWaveformCreator2(new File("E:/PRODI ILKOM/Semester VIII/TA/wave/plain wav.wav"), "plain wav.png");
        Object[] fields = {
                "Plain", awc.result
                ,"Stego", awc2.result
        };
        JOptionPane.showConfirmDialog(null, fields, "Wave Form", JOptionPane.PLAIN_MESSAGE);
    }

    private void reportStatus(String msg) {
        if ((errStr = msg) != null) {
            System.out.println(errStr);            
        }
    }

    private static void printUsage() {
        System.out.println("AudioWaveformCreator usage: java AudioWaveformCreator.class [path to audio file for generating the image] [path to save waveform image to]");
    }
}

These are the two waveforms I get:

[screenshot: the two waveforms shown in the JOptionPane, both identical]

1 Answer:

Answer 0 (score: 0)

When an AudioWaveformCreator2 instance is created, the SamplingGraph#saveToFile method is eventually executed. This method stores the generated waveform in the file fileName, where fileName is an AudioWaveformCreator2 field initialized with the fixed name out.png. Consequently, when several AudioWaveformCreator2 instances are created, both instances store their data in the same file out.png, and the second file overwrites the first one. After an AudioWaveformCreator2 instance has stored the file, a new ImageIcon is created with the ImageIcon(String filename) constructor. The source code of ImageIcon (e.g. http://hg.openjdk.java.net/jdk10/jdk10/jdk/file/777356696811/src/java.desktop/share/classes/javax/swing/ImageIcon.java) shows that the ImageIcon(String filename) constructor at some point calls the Toolkit.getDefaultToolkit().getImage(filename) method. The description of this method indicates that a caching mechanism is in place which returns the same image for requests with the same file name (see https://docs.oracle.com/javase/10/docs/api/java/awt/Toolkit.html#getImage(java.lang.String)):

Returns an image which gets pixel data from the specified file, whose format can be either GIF, JPEG or PNG. The underlying toolkit attempts to resolve multiple requests with the same filename to the same returned Image. Since the mechanism required to facilitate this sharing of Image objects may continue to hold onto images that are no longer in active use for an indefinite period of time, developers are encouraged to implement their own caching of images by using the createImage variant wherever available. If the image data contained in the specified file changes, the Image object returned from this method may still contain stale information which was loaded from the file after a prior call. Previously loaded image data can be manually discarded by calling the flush method on the returned Image.

This caching, in combination with the fixed name out.png, is responsible for the observed behavior: although the second AudioWaveformCreator2 instance overwrites the out.png file, the caching mechanism delivers the first image, so the first image is displayed twice:

[screenshot: the first waveform displayed twice]
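
To illustrate, here is a minimal, standalone sketch of the caching behavior described in the documentation above (the class name and file name are only for illustration): requesting the same file name twice through the default toolkit typically yields the same cached Image instance, so pixel data written to the file between the two requests is not picked up.

import java.awt.Image;
import java.awt.Toolkit;

public class ToolkitCacheDemo {
    public static void main(String[] args) {
        // first request: the toolkit loads and caches the image for "out.png"
        Image first = Toolkit.getDefaultToolkit().getImage("out.png");
        // ... suppose "out.png" is overwritten with different pixel data here ...
        // second request with the same file name resolves to the cached instance
        Image second = Toolkit.getDefaultToolkit().getImage("out.png");
        System.out.println(first == second); // typically prints true
        // first.flush(); // would discard the cached data and force a reload
    }
}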

A possible solution is to pass the file name to the AudioWaveformCreator2 constructor and initialize it there:

public AudioWaveformCreator2(File url, String waveformFilename, String fileName) throws Exception {
    if (url != null) {
        try {
            this.fileName = fileName;
    ...

AudioWaveformCreator2 awc = new AudioWaveformCreator2(new File("E:/PRODI ILKOM/Semester VIII/TA/wave/cars062.wav"), "cars062.png", "out.png");
AudioWaveformCreator2 awc2 = new AudioWaveformCreator2(new File("E:/PRODI ILKOM/Semester VIII/TA/wave/plain wav.wav"), "plain wav.png", "out2.png");
...

Here the first AudioWaveformCreator2 instance stores its image in the file out.png and the second AudioWaveformCreator2 instance in the file out2.png, so the caching mechanism can distinguish the two images:

[screenshot: the two different waveforms]
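
For completeness, here is a minimal sketch of the whole modified constructor; it is the constructor from the question's code with only the extra fileName parameter and the assignment added:

public AudioWaveformCreator2(File url, String waveformFilename, String fileName) throws Exception {
    if (url != null) {
        try {
            this.fileName = fileName;  // per-instance output file instead of the fixed "out.png"
            errStr = null;
            audioInputStream = AudioSystem.getAudioInputStream(url);
            long milliseconds = (long) ((audioInputStream.getFrameLength() * 1000) / audioInputStream.getFormat().getFrameRate());
            duration = milliseconds / 1000.0;
            samplingGraph = new SamplingGraph();
            samplingGraph.createWaveForm(null);
        } catch (Exception ex) {
            reportStatus(ex.toString());
            throw ex;
        }
    } else {
        reportStatus("Audio file required.");
    }
}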

There are also other solutions that keep storing the image in the same file out.png (i.e., without modifying the AudioWaveformCreator2 constructor), for example using the Toolkit.getDefaultToolkit().createImage(filename) method, which is described as follows (see, e.g., https://docs.oracle.com/javase/10/docs/api/java/awt/Toolkit.html#createImage(java.lang.String)):

Returns an image which gets pixel data from the specified file. The returned Image is a new object which will not be shared with any other caller of this method or its getImage variants.

Hence no caching mechanism is involved, and the fix is simply to replace

result = new ImageIcon(fileName);

with

result = new ImageIcon(Toolkit.getDefaultToolkit().createImage(fileName));

A third solution that avoids the caching mechanism is to replace

result = new ImageIcon(fileName);

with

result = new ImageIcon(bufferedImage);

since the ImageIcon(Image image) constructor does not use the Toolkit.getDefaultToolkit().getImage(filename) method but instead works directly with the data contained in bufferedImage.
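
A minimal sketch of saveToFile with this third variant applied (it reuses the fields and helper methods from the question's code; writing the PNG file is kept only as a side effect and no longer feeds the icon):

public void saveToFile() {
    int w = 500;
    int h = 200;
    int INFOPAD = 0;

    BufferedImage bufferedImage = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
    Graphics2D g2 = bufferedImage.createGraphics();
    createSampleOnGraphicsContext(w, h, INFOPAD, g2);
    g2.dispose();

    try {
        // the PNG can still be written for reference, but it is no longer used for the icon
        ImageIO.write(bufferedImage, "png", new File(fileName));
    } catch (IOException e) {
        e.printStackTrace();
    }

    // build the icon from the in-memory image, bypassing the Toolkit file cache
    result = new ImageIcon(bufferedImage);
}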