I have compiled FFmpeg (libffmpeg.so) on Android. Now I have to build either an application like RockPlayer, or use the existing Android multimedia framework, to invoke FFmpeg.

Do you have steps / procedures / code / examples for integrating FFmpeg on Android / StageFright? Can you guide me on how to use this library for multimedia playback? My requirement is that I already have audio and video transport streams, which I need to feed into FFmpeg to be decoded and rendered. How can I do this on Android, given that the IOMX APIs are OMX-based and cannot plug in FFmpeg here? I also could not find documentation on the FFmpeg APIs that need to be used for playback.


These are the steps I went through in getting ffmpeg to work on Android:

1) Build static libraries of ffmpeg for Android. This was achieved by building olvaffe's ffmpeg android port (libffmpeg) using the Android Build System. Simply place the sources under /external and make away. You'll need to extract bionic (libc) and zlib (libz) from the Android build as well, as the ffmpeg libraries depend on them.

2) Create a dynamic library wrapping ffmpeg functionality using the Android NDK. There's a lot of documentation out there on how to work with the NDK. Basically you'll need to write some C/C++ code to export the functionality you need out of ffmpeg into a library Java can interact with through JNI. The NDK allows you to easily link against the static libraries you've generated in step 1; just add a line similar to this to Android.mk:

   LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz

3) Use the ffmpeg-wrapping dynamic library from your Java sources. There's enough documentation on JNI out there; you should be fine.
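As a hedged sketch of step 2, an Android.mk for such a JNI wrapper module might look like the following (the module name `ffmpeg-jni` and source file name are placeholders, not taken from the answer):

```makefile
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

# Hypothetical module/file names for the JNI wrapper
LOCAL_MODULE    := ffmpeg-jni
LOCAL_SRC_FILES := ffmpeg-jni.c

# Link against the static ffmpeg libraries built in step 1,
# as suggested in the answer above
LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz

include $(BUILD_SHARED_LIBRARY)
```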

Regarding playback with ffmpeg, there are many examples (the ffmpeg binary itself is a good one), and here is a basic tutorial. The best documentation can be found in the headers.

Good luck.


I put together a small project to configure and build X264 and FFmpeg using the Android NDK. The main thing missing is a decent JNI interface to make it accessible via Java, but that is the easy part (relatively speaking). When I get around to making the JNI interface good for my own uses, I will push it in.

The benefit over olvaffe's build system is that it doesn't require Android.mk files to build the libraries; it just uses the regular makefiles and the toolchain. This makes it much less likely to stop working when you pull new changes from FFmpeg or X264.

https://github.com/halfninja/android-ffmpeg-x264


For various reasons, multimedia is never easy when you must accomplish the task without compromising on efficiency. FFmpeg is an effort to improve it day by day. It supports codecs and containers of many different formats.

Now, to answer the question of how to use this library: writing it all up here is not that simple, but I can guide you as follows.

1) Inside the ffmpeg directory of the source code you have output_example.c and api_example.c. There you can see the code where encoding/decoding is done, and you will learn which APIs in ffmpeg you should call. This should be your first step.

2) Dolphin Player is an open-source project for Android. It currently has bugs, but the developers are working on it continuously. In that project you have the whole setup ready, which you can use to continue your investigation. Here is the project link on code.google.com, or run the command "git clone https://code.google.com/p/dolphin-player/" in a terminal. You can see two projects named P and P86; you can use either of them.

The extra tip I would like to offer is that when you build the ffmpeg code, in build.sh you need to enable the muxers/demuxers/encoders/decoders of the formats you want to use. Otherwise the corresponding code will not be included in the libraries. It took me a long time to realize this, so I thought I would share it with you.

Some basics: when we say a video file, e.g. avi, it is a combination of both audio and video.

Video file = Video + Audio


Video = Codec + Muxer + Demuxer

Codec = Encoder + Decoder

=> Video = Encoder + Decoder + Muxer + Demuxer (Mpeg4 + Mpeg4 + avi + avi - example for the avi container)


Audio = Codec + Muxer + Demuxer

Codec = Encoder + Decoder

=> Audio = Encoder + Decoder + Muxer + Demuxer (mp2 + mp2 + avi + avi - example for the avi container)


A codec (the name comes from a combination of en*co*der / *dec*oder) is just a part of a format that defines the algorithms used to encode/decode a frame. AVI is not a codec; it is a container, which here uses the video codec Mpeg4 and the audio codec mp2.

A muxer/demuxer is used to merge/separate frames from a file while encoding/decoding.

So if you want to use the avi format, you need to enable the video components + audio components.

For example, for avi you need to enable the following: mpeg4 encoder, mpeg4 decoder, mp2 encoder, mp2 decoder, avi muxer, avi demuxer.
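That checklist can be turned into a tiny helper that, given a container and its video/audio codecs, emits the configure flags to put into build.sh (the class and method names here are purely illustrative, not from the answer):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative helper: derives the build.sh configure flags needed for a
// container plus its video/audio codecs, following the checklist above.
public class FlagHelper {
    public static List<String> flagsFor(String container, String videoCodec, String audioCodec) {
        List<String> flags = new ArrayList<>();
        // Muxer/demuxer are specific to the container
        flags.add("--enable-muxer=" + container);
        flags.add("--enable-demuxer=" + container);
        // Encoder/decoder pairs, one per codec
        for (String codec : new String[]{videoCodec, audioCodec}) {
            flags.add("--enable-encoder=" + codec);
            flags.add("--enable-decoder=" + codec);
        }
        return flags;
    }

    public static void main(String[] args) {
        System.out.println(String.join(" ", flagsFor("avi", "mpeg4", "mp2")));
    }
}
```

Running it for avi/mpeg4/mp2 produces exactly the six flags listed below.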

Phew...

Programmatically, build.sh should contain the following flags:

--enable-muxer=avi --enable-demuxer=avi (generic for both audio/video; usually specific to a container)
--enable-encoder=mpeg4 --enable-decoder=mpeg4 (for video support)
--enable-encoder=mp2 --enable-decoder=mp2 (for audio support)

I hope I haven't confused you even more after all this...

Thanks; if you need any help, please let me know.


The implementation I found easiest to build and easiest to use is made by the guardianproject team: https://github.com/guardianproject/android-ffmpeg


Inspired by other FFmpeg implementations on Android (mainly the guardianproject), I found a solution (with Lame support as well).

(Lame and FFmpeg: https://github.com/intervigilium/liblame and http://bambuser.com/opensource)

To call FFmpeg:

new Thread(new Runnable() {

    @Override
    public void run() {

        Looper.prepare();

        FfmpegController ffmpeg = null;

        try {
            ffmpeg = new FfmpegController(context);
        } catch (IOException ioe) {
            Log.e(DEBUG_TAG, "Error loading ffmpeg. " + ioe.getMessage());
        }

        ShellDummy shell = new ShellDummy();
        String mp3BitRate = "192";

        try {
            ffmpeg.extractAudio(in, out, audio, mp3BitRate, shell);
        } catch (IOException e) {
            Log.e(DEBUG_TAG, "IOException running ffmpeg: " + e.getMessage());
        } catch (InterruptedException e) {
            Log.e(DEBUG_TAG, "InterruptedException running ffmpeg: " + e.getMessage());
        }

        Looper.loop();

    }

}).start();

and to handle the console output:

private class ShellDummy implements ShellCallback {

    @Override
    public void shellOut(String shellLine) {
        if (someCondition) {
            doSomething(shellLine);
        }
        Utils.logger("d", shellLine, DEBUG_TAG);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue == 0) {
            // Audio job OK, do your stuff: 

            // e.g.
            // write id3 tags,
            // call the media scanner,
            // etc.
        }
    }

    @Override
    public void processNotStartedCheck(boolean started) {
        if (!started) {
            // Audio job error, as above.
        }
    }
}

Strangely, this project has not been mentioned: AndroidFFmpeg by Appunite.

It has quite detailed step-by-step instructions to copy/paste into the command line, for lazy people like me ))


To build my FFmpeg application I used this project (https://github.com/hiteshsondhi88/ffmpeg-android-java), so I didn't need to compile anything. I think it is the easy way to use FFmpeg in our Android applications.

More info at http://hiteshsondhi88.github.io/ffmpeg-android-java/


I had the same problem, and I found most of the answers here outdated. I ended up writing a wrapper on top of FFmpeg to access it from Android with a single line of code.

https://github.com/madhavanmalolan/ffmpegandroidlibrary


After a lot of research, this is currently the most up-to-date compiled library for Android that I have found:

https://github.com/bravobit/FFmpeg-Android

Currently uses FFmpeg version n4.0-39-gda39990.
Includes FFmpeg and FFprobe.
Ships with a Java interface to launch commands.
FFprobe or FFmpeg can be removed from the APK; check the wiki: https://github.com/bravobit/FFmpeg-Android/wiki


First, add the dependency for the FFmpeg library:

implementation 'com.writingminds:FFmpegAndroid:0.3.2'

Then load it in the activity:

FFmpeg ffmpeg;
    private void trimVideo(ProgressDialog progressDialog) {

    outputAudioMux = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES).getAbsolutePath()
            + "/VidEffectsFilter" + "/" + new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date())
            + "filter_apply.mp4";

    if (startTrim.equals("")) {
        startTrim = "00:00:00";
    }

    if (endTrim.equals("")) {
        endTrim = timeTrim(player.getDuration());
    }

    String[] cmd = new String[]{"-ss", startTrim + ".00", "-t", endTrim + ".00", "-noaccurate_seek", "-i", videoPath, "-codec", "copy", "-avoid_negative_ts", "1", outputAudioMux};


    execFFmpegBinary1(cmd, progressDialog);
    }



    private void execFFmpegBinary1(final String[] command, ProgressDialog prpg) {

    ProgressDialog progressDialog = prpg;

    try {
        ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
            @Override
            public void onFailure(String s) {
                progressDialog.dismiss();
                Toast.makeText(PlayerTestActivity.this, "Failed to generate video", Toast.LENGTH_SHORT).show();
                Log.d(TAG, "FAILED with output : " + s);
            }

            @Override
            public void onSuccess(String s) {
                Log.d(TAG, "SUCCESS with output : " + s);

//                    pathVideo = outputAudioMux;
                String finalPath = outputAudioMux;
                videoPath = outputAudioMux;
                Toast.makeText(PlayerTestActivity.this, "Storage Path =" + finalPath, Toast.LENGTH_SHORT).show();

                Intent intent = new Intent(PlayerTestActivity.this, ShareVideoActivity.class);
                intent.putExtra("pathGPU", finalPath);
                startActivity(intent);
                finish();
                MediaScannerConnection.scanFile(PlayerTestActivity.this, new String[]{finalPath}, new String[]{"mp4"}, null);

            }

            @Override
            public void onProgress(String s) {
                Log.d(TAG, "Progress command : ffmpeg " + command);
                progressDialog.setMessage("Please wait, video trimming...");
            }

            @Override
            public void onStart() {
                Log.d(TAG, "Started command : ffmpeg " + command);

            }

            @Override
            public void onFinish() {
                Log.d(TAG, "Finished command : ffmpeg " + command);
                progressDialog.dismiss();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // do nothing for now
    }
}

  private void loadFFMpegBinary() {
    try {
        if (ffmpeg == null) {
            ffmpeg = FFmpeg.getInstance(this);
        }
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                showUnsupportedExceptionDialog();
            }

            @Override
            public void onSuccess() {
                Log.d("dd", "ffmpeg : correct Loaded");
            }
        });
    } catch (FFmpegNotSupportedException e) {
        showUnsupportedExceptionDialog();
    } catch (Exception e) {
        Log.d("dd", "EXception no controlada : " + e);
    }
}

private void showUnsupportedExceptionDialog() {
    new AlertDialog.Builder(this)
            .setIcon(android.R.drawable.ic_dialog_alert)
            .setTitle("Not Supported")
            .setMessage("Device Not Supported")
            .setCancelable(false)
            .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    finish();
                }
            })
            .create()
            .show();

}
    public String timeTrim(long milliseconds) {
        String finalTimerString = "";
        String minutString = "";
        String secondsString = "";

        // Convert total duration into time
        int hours = (int) (milliseconds / (1000 * 60 * 60));
        int minutes = (int) (milliseconds % (1000 * 60 * 60)) / (1000 * 60);
        int seconds = (int) ((milliseconds % (1000 * 60 * 60)) % (1000 * 60) / 1000);
        // Add hours if there

        if (hours < 10) {
            finalTimerString = "0" + hours + ":";
        } else {
            finalTimerString = hours + ":";
        }


        if (minutes < 10) {
            minutString = "0" + minutes;
        } else {
            minutString = "" + minutes;
        }

        // Prepending 0 to seconds if it is one digit
        if (seconds < 10) {
            secondsString = "0" + seconds;
        } else {
            secondsString = "" + seconds;
        }

        finalTimerString = finalTimerString + minutString + ":" + secondsString;

        // return timer string
        return finalTimerString;
    }
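Since the command built in trimVideo is just a string array, it can be assembled in a plain helper and sanity-checked off-device; a hedged sketch (the class name TrimCommand is illustrative, not from the answer):

```java
// Illustrative helper: builds the same trim command as trimVideo above,
// so the argument order can be unit-tested off-device.
public class TrimCommand {
    public static String[] build(String start, String end, String input, String output) {
        return new String[]{
                "-ss", start + ".00",   // seek to the start position
                "-t", end + ".00",      // -t takes a duration; the answer passes endTrim here
                "-noaccurate_seek",     // fast (keyframe-based) seeking
                "-i", input,
                "-codec", "copy",       // stream copy, no re-encoding
                "-avoid_negative_ts", "1",
                output};
    }
}
```

Passing the arguments as an array (rather than one shell string) also means paths with spaces survive intact.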

You can also use other features of FFmpeg:

===> merge audio to video
String[] cmd = new String[]{"-i", yourRealPath, "-i", arrayList.get(posmusic).getPath(), "-map", "1:a", "-map", "0:v", "-codec", "copy", "-shortest", outputcrop};


===> Flip vertical :
String[] cm = new String[]{"-i", yourRealPath, "-vf", "vflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Flip horizontally :  
String[] cm = new String[]{"-i", yourRealPath, "-vf", "hflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Rotate 90 degrees clockwise:
String[] cm=new String[]{"-i", yourRealPath, "-c", "copy", "-metadata:s:v:0", "rotate=90", outputcrop1};


===> Compress Video
String[] complexCommand = {"-y", "-i", yourRealPath, "-strict", "experimental", "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "24", "-acodec", "aac", "-ar", "22050", "-ac", "2", "-b", "360k", "-s", "1280x720", outputcrop1};


===> Speed up / slow down video
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=1.0*PTS[v];[0:a]atempo=1.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.75*PTS[v];[0:a]atempo=1.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
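The four variants differ only in the setpts/atempo pair. Since setpts scales video timestamps by 1/speed and atempo scales the audio tempo by speed, both can be derived from a single speed factor; a hedged sketch (the class name is illustrative; note the 1.5x line above uses 0.75 rather than the exact 1/1.5 ≈ 0.67):

```java
// Illustrative helper: derives the filter_complex string for a given
// speed factor. setpts multiplies timestamps by 1/speed; atempo
// multiplies the audio tempo by speed (atempo accepts 0.5 to 2.0).
public class SpeedFilter {
    public static String filterFor(double speed) {
        double setpts = 1.0 / speed;
        return "[0:v]setpts=" + setpts + "*PTS[v];[0:a]atempo=" + speed + "[a]";
    }
}
```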



===> Add two mp3 files 

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0]concat=n=2:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()




===> Add three mp3 files

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(firstSngname);
sb.append(" -i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0][2:0]concat=n=3:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()
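The two-file and three-file commands generalize: the concat filter just needs one [i:0] label per input and n set to the input count. A hedged sketch of a generic builder (class name illustrative), which also avoids the split(" ") problem when file paths contain spaces by producing the argument array directly:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative helper: builds the same concat command as above for any
// number of input audio files.
public class ConcatCommand {
    public static String[] build(List<String> inputs, String outputFile) {
        List<String> cmd = new ArrayList<>();
        StringBuilder labels = new StringBuilder();
        for (int i = 0; i < inputs.size(); i++) {
            cmd.add("-i");
            cmd.add(inputs.get(i));
            labels.append("[").append(i).append(":0]");  // one label per input
        }
        cmd.add("-filter_complex");
        // e.g. [0:0][1:0]concat=n=2:v=0:a=1[out]
        cmd.add(labels + "concat=n=" + inputs.size() + ":v=0:a=1[out]");
        cmd.add("-map");
        cmd.add("[out]");
        cmd.add(outputFile);
        return cmd.toArray(new String[0]);
    }
}
```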