Generate a (not so) random string with specific pattern occurrences

Time: 2015-04-23 10:33:21

Tags: c++ string random

I have a requirement: using the letters 'ACGT', I need to create a string of about 20,000 characters, and the string should contain 100+ occurrences of the pattern "CCGT". Most of the time, the generated string contains only about 20-30 instances.

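A naive generator along these lines is assumed (the question's original code block is not shown):

#include <cstdlib>
#include <string>

std::string generate()
{
    static const char alphabet[] = "ACGT";
    std::string s;
    s.reserve(20000);
    for (int i = 0; i < 20000; ++i)
        s += alphabet[std::rand() % 4];   // each character picked uniformly
    return s;
}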

How can I adjust the code so that the pattern appears more often?

Edit - is there a way to change the alphabet, i.e. to use 'A', 'C', 'G', 'T' and "CCGT" as the characters of the alphabet?

Thanks.

6 Answers:

Answer 0 (score: 2):

Generate an integer array containing 100 zeros and 4,900 each of 1s, 2s, 3s and 4s [000000....111111....2222 etc.], making nearly 20,000 entries.

Then shuffle it randomly (std::random_shuffle).

Then write out a string in which every 0 is converted to "CCGT", every 1 to 'A', every 2 to... and so on.

I think this gives you what you need; by adjusting the original integer array you can also change the number of 'A' characters (and so on) in the output.

Edit: if this is not random enough, do the 100 zeros at the start and then random 1-4 values for the rest.

Answer 1 (score: 1):

The only solution I can think of that meets the "100+" criterion is:

#include <cstdlib>
#include <string>
std::string make_string()
{
    std::string s(20000, ' ');
    for (char& c : s) c = "ACGT"[rand() % 4];   // create 20000 character string
    int n = 100 + rand() % 20;                  // number of instances (call it n)
    for (int i = 0; i < n; ++i)
    {
        int pos = rand() % (20000 - 3);         // pick random start position
        s.replace(pos, 4, "CCGT");              // write CCGT
    }
    return s;
}

Of course, you need to make sure the characters being overwritten are not already part of a "CCGT".

Answer 2 (score: 1):

My first thought is to generate a list of 100 indices at which you will definitely insert the special string; then, while generating the random string, insert the special string at each of those indices as you reach it.

I have left out the check that the intervals are properly spaced (no interval within 4 of another) and that no values repeat; that check is necessary, as is sorting them in ascending order (done below with std::sort).

#include <algorithm>
#include <cstdlib>
#include <string>

int main()
{
    const int N = 20000;
    std::string alphabet("ACGT");
    std::string pattern("CCGT");                // the special string to insert
    int intervals[100];
    for (int index = 0; index < 100; index++)   // the original loop never incremented index
    {
        intervals[index] = rand() % (N - 3);    // leave room for the 4-character pattern
        // Do some sort of check to make sure each element of intervals is not
        // within 4 of another element and that no elements are repeated
    }
    std::sort(intervals, intervals + 100);      // sort the intervals in ascending order
    int current_interval_index = 0;
    std::string str;
    str.reserve(N);
    for (int index = 0; index < N; index++)
    {
        if (current_interval_index < 100 && index == intervals[current_interval_index])
        {
            str += pattern;                     // insert "CCGT", not the whole alphabet
            current_interval_index++;
            index += 3;                         // account for the 3 extra characters added
        }
        else
        {
            str += alphabet[rand() % alphabet.length()];
        }
    }
    return 0;
}

Answer 3 (score: 1):

The solution I came up with uses a std::vector to hold all of the random 4-character chunks, including the 100 special sequences. I then shuffle that vector so that the 100 special sequences are distributed randomly throughout the string.

To handle the letter distribution, I created a second, weighted alphabet string whose relative character abundances account for the letters already contributed by the 100 special sequences.

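A minimal sketch of this approach; the weighting shown is illustrative and the chunk counts assume the 20,000-character target (this is not the answer's original listing):

#include <algorithm>
#include <cstdlib>
#include <ctime>
#include <string>
#include <vector>

int main()
{
    std::srand(static_cast<unsigned>(std::time(0)));

    // Hypothetical weighting: the 100 "CCGT" chunks already contribute
    // 200 C's, 100 G's and 100 T's, so the filler slightly favors 'A'.
    const std::string weighted = "AAAACCCGGGTTT";

    std::vector<std::string> chunks(100, "CCGT");   // the special sequences
    for (int i = 0; i < 4900; ++i)                  // 4,900 random 4-char chunks
    {
        std::string chunk;
        for (int j = 0; j < 4; ++j)
            chunk += weighted[std::rand() % weighted.size()];
        chunks.push_back(chunk);
    }

    std::random_shuffle(chunks.begin(), chunks.end());   // scatter the CCGTs

    std::string result;
    result.reserve(20000);
    for (const std::string& c : chunks)
        result += c;
    return 0;
}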

Answer 4 (score: 1):

I like @Andy Newman's answer and think it is probably the best way; the code below is a compilable example of their suggestion.

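A compilable sketch of that suggestion, reconstructed from the description in answer 0 (this is not the answer's original listing):

#include <algorithm>
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::srand(static_cast<unsigned>(std::time(0)));

    // 100 zeros (each expands to "CCGT") plus 4,900 each of 1..4
    // (each expands to one letter): 400 + 19,600 = 20,000 characters.
    std::vector<int> codes(100, 0);
    for (int v = 1; v <= 4; ++v)
        codes.insert(codes.end(), 4900, v);

    std::random_shuffle(codes.begin(), codes.end());

    const char* expand[] = { "CCGT", "A", "C", "G", "T" };
    std::string result;
    result.reserve(20000);
    for (std::vector<int>::size_type i = 0; i < codes.size(); ++i)
        result += expand[codes[i]];

    std::cout << result << '\n';
    return 0;
}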

Answer 5 (score: 1):

You should make the string length bigger.

I take this liberty because you say you want to create a string of "about 20,000 characters"; but there is more to it than that.

If you are only finding about 20-30 instances in a 20,000-character string, something has gone wrong. A ballpark estimate: there are 20,000 character positions to test, and at each one a specific four-character string over a four-letter alphabet matches with probability 1/256. The average should therefore be (roughly; I have oversimplified) 20000/256, or about 78 hits.

It may be that your string is not being properly randomized (perhaps through use of the modulus idiom), or you may be testing only every fourth character position, as if the string were a list of non-overlapping four-letter words.

If you can get the average hit count up to 78, then simply increasing the string length proportionally will get you over the 100-hit requirement.
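
As a rough sanity check of that estimate, this sketch (my own illustration, not part of the original answer) counts occurrences of "CCGT" in a uniformly random 20,000-character string; it should print a number in the vicinity of 78:

#include <cstdlib>
#include <ctime>
#include <iostream>
#include <string>

int main()
{
    std::srand(static_cast<unsigned>(std::time(0)));

    std::string s;
    s.reserve(20000);
    for (int i = 0; i < 20000; ++i)
        s += "ACGT"[std::rand() % 4];

    int hits = 0;
    for (std::string::size_type pos = 0;
         (pos = s.find("CCGT", pos)) != std::string::npos; ++pos)
        ++hits;

    std::cout << hits << '\n';   // expected value is 19997/256, about 78
    return 0;
}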