I have compiled FFmpeg (libffmpeg.so) on Android. Now I have to build either an application like RockPlayer, or use the existing Android multimedia framework to invoke FFmpeg.

Do you have steps / procedures / code / an example for integrating FFmpeg on Android / StageFright? Can you guide me on how to use this library for multimedia playback? My requirement is that I already have audio and video transport streams, which I need to feed to FFmpeg to decode and render. How can I do this on Android, given that the IOMX APIs are OMX-based and FFmpeg cannot be plugged in there? I also could not find documentation on the FFmpeg APIs that are needed for playback.


Current answer

Strangely, this project hasn't been mentioned: AndroidFFmpeg from Appunite

It has quite detailed step-by-step instructions to copy/paste into the command line, for lazy people like me :)

Other answers

Inspired by other FFmpeg implementations for Android (mainly the guardianproject one), I found a solution (with LAME support as well).

(LAME and FFmpeg: https://github.com/intervigilium/liblame and http://bambuser.com/opensource)

Calling FFmpeg:

new Thread(new Runnable() {

    @Override
    public void run() {

        Looper.prepare();

        FfmpegController ffmpeg = null;

        try {
            ffmpeg = new FfmpegController(context);
        } catch (IOException ioe) {
            Log.e(DEBUG_TAG, "Error loading ffmpeg. " + ioe.getMessage());
            // Bail out: ffmpeg is still null here and the call below would throw a NullPointerException.
            return;
        }

        ShellDummy shell = new ShellDummy();
        String mp3BitRate = "192";

        try {
            ffmpeg.extractAudio(in, out, audio, mp3BitRate, shell);
        } catch (IOException e) {
            Log.e(DEBUG_TAG, "IOException running ffmpeg. " + e.getMessage());
        } catch (InterruptedException e) {
            Log.e(DEBUG_TAG, "InterruptedException running ffmpeg. " + e.getMessage());
        }

        Looper.loop();

    }

}).start();

And handling the console output:

private class ShellDummy implements ShellCallback {

    @Override
    public void shellOut(String shellLine) {
        if (someCondition) {
            doSomething(shellLine);
        }
        Utils.logger("d", shellLine, DEBUG_TAG);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue == 0) {
            // Audio job OK, do your stuff:
            // e.g. write ID3 tags,
            // call the media scanner,
            // etc.
        }
    }

    @Override
    public void processNotStartedCheck(boolean started) {
        if (!started) {
            // Audio job error, as above.
        }
    }
}

The implementation I found easiest to build and easiest to use is the one made by the guardianproject team: https://github.com/guardianproject/android-ffmpeg


First, add the dependency on the FFmpeg library:

implementation 'com.writingminds:FFmpegAndroid:0.3.2'

Then load it in the Activity:

FFmpeg ffmpeg;

private void trimVideo(ProgressDialog progressDialog) {

    outputAudioMux = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES).getAbsolutePath()
            + "/VidEffectsFilter" + "/" + new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date())
            + "filter_apply.mp4";

    if (startTrim.equals("")) {
        startTrim = "00:00:00";
    }

    if (endTrim.equals("")) {
        endTrim = timeTrim(player.getDuration());
    }

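    // "-ss"/"-t" select the piece to keep ("-t" is a duration, not an absolute end time),
    // "-noaccurate_seek" allows a fast, keyframe-based seek, "-codec copy" avoids re-encoding,
    // and "-avoid_negative_ts 1" shifts timestamps so the copied streams stay valid after the cut.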
    String[] cmd = new String[]{"-ss", startTrim + ".00", "-t", endTrim + ".00", "-noaccurate_seek", "-i", videoPath, "-codec", "copy", "-avoid_negative_ts", "1", outputAudioMux};


    execFFmpegBinary1(cmd, progressDialog);
}



private void execFFmpegBinary1(final String[] command, ProgressDialog prpg) {

    ProgressDialog progressDialog = prpg;

    try {
        ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
            @Override
            public void onFailure(String s) {
                progressDialog.dismiss();
                Toast.makeText(PlayerTestActivity.this, "Failed to generate video", Toast.LENGTH_SHORT).show();
                Log.d(TAG, "FAILED with output : " + s);
            }

            @Override
            public void onSuccess(String s) {
                Log.d(TAG, "SUCCESS wgith output : " + s);

                // pathVideo = outputAudioMux;
                String finalPath = outputAudioMux;
                videoPath = outputAudioMux;
                Toast.makeText(PlayerTestActivity.this, "Storage Path =" + finalPath, Toast.LENGTH_SHORT).show();

                Intent intent = new Intent(PlayerTestActivity.this, ShareVideoActivity.class);
                intent.putExtra("pathGPU", finalPath);
                startActivity(intent);
                finish();
                MediaScannerConnection.scanFile(PlayerTestActivity.this, new String[]{finalPath}, new String[]{"video/mp4"}, null);

            }

            @Override
            public void onProgress(String s) {
                Log.d(TAG, "Progress : " + s);
                progressDialog.setMessage("Please wait, trimming video...");
            }

            @Override
            public void onStart() {
                Log.d(TAG, "Started command : ffmpeg " + TextUtils.join(" ", command));
            }

            @Override
            public void onFinish() {
                Log.d(TAG, "Finished command : ffmpeg " + TextUtils.join(" ", command));
                progressDialog.dismiss();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // do nothing for now
    }
}

private void loadFFMpegBinary() {
    try {
        if (ffmpeg == null) {
            ffmpeg = FFmpeg.getInstance(this);
        }
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                showUnsupportedExceptionDialog();
            }

            @Override
            public void onSuccess() {
                Log.d("dd", "ffmpeg : correct Loaded");
            }
        });
    } catch (FFmpegNotSupportedException e) {
        showUnsupportedExceptionDialog();
    } catch (Exception e) {
        Log.d("dd", "EXception no controlada : " + e);
    }
}

private void showUnsupportedExceptionDialog() {
    new AlertDialog.Builder(this)
            .setIcon(android.R.drawable.ic_dialog_alert)
            .setTitle("Not Supported")
            .setMessage("Device Not Supported")
            .setCancelable(false)
            .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    finish();
                }
            })
            .create()
            .show();

}
    public String timeTrim(long milliseconds) {
        String finalTimerString = "";
        String minutString = "";
        String secondsString = "";

        // Convert total duration into time
        int hours = (int) (milliseconds / (1000 * 60 * 60));
        int minutes = (int) (milliseconds % (1000 * 60 * 60)) / (1000 * 60);
        int seconds = (int) ((milliseconds % (1000 * 60 * 60)) % (1000 * 60) / 1000);
        // Prepending 0 to hours if it is one digit
        if (hours < 10) {
            finalTimerString = "0" + hours + ":";
        } else {
            finalTimerString = hours + ":";
        }


        if (minutes < 10) {
            minutString = "0" + minutes;
        } else {
            minutString = "" + minutes;
        }

        // Prepending 0 to seconds if it is one digit
        if (seconds < 10) {
            secondsString = "0" + seconds;
        } else {
            secondsString = "" + seconds;
        }

        finalTimerString = finalTimerString + minutString + ":" + secondsString;

        // return timer string
        return finalTimerString;
    }
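For completeness, here is a minimal sketch of how these pieces can be wired together (the layout name and button id are assumptions, everything else reuses the methods above): the binary has to be loaded via loadFFMpegBinary() before trimVideo() can run, for example in onCreate():

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_player_test);   // assumed layout name

    // The binary must be loaded once before any FFmpeg command is executed.
    loadFFMpegBinary();

    // Assumed button id; trimming is started from a click, after the binary check has run.
    findViewById(R.id.btn_trim).setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            ProgressDialog progressDialog = new ProgressDialog(PlayerTestActivity.this);
            progressDialog.setMessage("Please wait, trimming video...");
            progressDialog.show();
            trimVideo(progressDialog);
        }
    });
}

As a quick sanity check of timeTrim(): timeTrim(3725000) returns "01:02:05" (1 h 2 min 5 s).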

You can also use other features of FFmpeg:

===> merge audio to video
String[] cmd = new String[]{"-i", yourRealPath, "-i", arrayList.get(posmusic).getPath(), "-map", "1:a", "-map", "0:v", "-codec", "copy", "-shortest", outputcrop};
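// "-map 1:a" takes the audio from the second input and "-map 0:v" the video from the first;
// "-codec copy" avoids re-encoding and "-shortest" stops at the end of the shorter stream.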


===> Flip vertical :
String[] cm = new String[]{"-i", yourRealPath, "-vf", "vflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Flip horizontally :  
String[] cm = new String[]{"-i", yourRealPath, "-vf", "hflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Rotate 90 degrees clockwise:
String[] cm=new String[]{"-i", yourRealPath, "-c", "copy", "-metadata:s:v:0", "rotate=90", outputcrop1};
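// Note: this only writes a rotation tag into the metadata while copying the streams; players that
// ignore the tag will show the video unrotated. To rotate the actual pixels, re-encode with a
// filter such as "-vf transpose=1" (90 degrees clockwise) instead of "-c copy".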


===> Compress Video
String[] complexCommand = {"-y", "-i", yourRealPath, "-strict", "experimental", "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "24", "-acodec", "aac", "-ar", "22050", "-ac", "2", "-b", "360k", "-s", "1280x720", outputcrop1};
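// "-crf 24" sets the quality (lower = better quality, bigger file), "-preset ultrafast" favours speed
// over compression, "-s 1280x720" scales the output, and "-ar 22050 -ac 2" set the audio sample rate
// and channel count. "-b 360k" is ambiguous; prefer "-b:v 360k" for the video bitrate.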


===> Speed up down video
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=1.0*PTS[v];[0:a]atempo=1.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.75*PTS[v];[0:a]atempo=1.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
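The pattern in these commands is that the setpts factor for the video is the reciprocal of the atempo factor for the audio (and older FFmpeg builds only accept atempo values between 0.5 and 2.0 per filter instance). A small helper that derives both from a single speed factor might look like this (buildSpeedCommand is a hypothetical name, not part of the library):

// speed > 1.0 plays faster, speed < 1.0 plays slower; keep 0.5 <= speed <= 2.0
// because a single atempo instance only supports that range.
private String[] buildSpeedCommand(String inputPath, String outputPath, double speed) {
    String filter = "[0:v]setpts=" + (1.0 / speed) + "*PTS[v];[0:a]atempo=" + speed + "[a]";
    return new String[]{"-y", "-i", inputPath, "-filter_complex", filter,
            "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60",
            "-vcodec", "mpeg4", outputPath};
}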



===> Add two mp3 files 

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0]concat=n=2:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()




===> Add three mp3 files

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(firstSngname);
sb.append(" -i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0][2:0]concat=n=3:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()
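For either concat command, the ffmpeg.execute(...) call hinted at above needs a complete response handler; a minimal sketch, reusing the ExecuteBinaryResponseHandler pattern from the trim example (the handler body and log messages are just placeholders):

try {
    ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String s) {
            Log.d(TAG, "concat SUCCESS with output : " + s);
        }

        @Override
        public void onFailure(String s) {
            Log.d(TAG, "concat FAILED with output : " + s);
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    // This library only runs one command at a time.
}

Note that splitting the joined command on spaces breaks as soon as a file path contains a space; building the String[] directly, as in the trim example, avoids that.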

I had the same issue and found that most of the answers here are outdated. I ended up writing a wrapper on top of FFmpeg that can be accessed from Android with a single line of code.

https://github.com/madhavanmalolan/ffmpegandroidlibrary