Make sure the SurfaceView or associated SurfaceHolder has a valid Surface
My app crashes when I try to open OpenGlRtmpActivity.
2023-11-01 14:07:05.517 29652-30493 SurfaceManager com.ipcloud.cms E GL already released
2023-11-01 14:07:05.532 29652-30493 AndroidRuntime com.ipcloud.cms E FATAL EXCEPTION: glThread
Process: com.ipcloud.cms, PID: 29652
java.lang.IllegalArgumentException: Make sure the SurfaceView or associated SurfaceHolder has a valid Surface
at android.opengl.EGL14._eglCreateWindowSurface(Native Method)
at android.opengl.EGL14.eglCreateWindowSurface(EGL14.java:263)
at com.pedro.encoder.input.gl.SurfaceManager.eglSetup(SurfaceManager.java:155)
at com.pedro.encoder.input.gl.SurfaceManager.eglSetup(SurfaceManager.java:175)
at com.pedro.library.view.OpenGlView.run(OpenGlView.java:171)
at java.lang.Thread.run(Thread.java:764)
My updated OpenGlRtmpActivity
import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.content.res.ColorStateList;
import android.graphics.BitmapFactory;
import android.graphics.Color;
import android.hardware.Camera;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.provider.MediaStore;
import android.util.Log;
import android.view.MotionEvent;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.View;
import android.view.WindowManager;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.CheckBox;
import android.widget.EditText;
import android.widget.ImageView;
import android.widget.Spinner;
import android.widget.Toast;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import androidx.appcompat.widget.Toolbar;
import androidx.constraintlayout.motion.widget.MotionLayout;
import androidx.drawerlayout.widget.DrawerLayout;
import androidx.lifecycle.Observer;
import androidx.lifecycle.ViewModelProvider;
import com.google.android.material.dialog.MaterialAlertDialogBuilder;
import com.google.android.material.navigation.NavigationView;
import com.google.android.material.textfield.TextInputLayout;
import com.ipcloud.sharedmodule.utils.NetworkResults;
import com.ipcloud.sharedmodule.utils.SP;
import com.ipcloud.user_df.R;
import com.ipcloud.user_df.api.UserApiUtils;
import com.ipcloud.user_df.databinding.ActivityOpenGlBinding;
import com.ipcloud.user_df.model.streamkey.streamkeylist.Data;
import com.ipcloud.user_df.model.streamkey.streamkeylist.UserStreamKeyListModel;
import com.ipcloud.user_df.model.streamkey.streamkeylist.UserStreamKeyListPostModel;
import com.ipcloud.user_df.repository.UserRepository;
import com.ipcloud.user_df.ui.stream.utils.PathUtils;
import com.ipcloud.user_df.ui.stream.utils.TopSheetFragment;
import com.ipcloud.user_df.viewmodel.UserViewModel;
import com.ipcloud.user_df.viewmodel.UserViewModelFactory;
import com.pedro.encoder.input.gl.SpriteGestureController;
import com.pedro.encoder.input.gl.render.filters.AnalogTVFilterRender;
import com.pedro.encoder.input.gl.render.filters.BasicDeformationFilterRender;
import com.pedro.encoder.input.gl.render.filters.BeautyFilterRender;
import com.pedro.encoder.input.gl.render.filters.BlackFilterRender;
import com.pedro.encoder.input.gl.render.filters.BlurFilterRender;
import com.pedro.encoder.input.gl.render.filters.BrightnessFilterRender;
import com.pedro.encoder.input.gl.render.filters.CartoonFilterRender;
import com.pedro.encoder.input.gl.render.filters.ChromaFilterRender;
import com.pedro.encoder.input.gl.render.filters.CircleFilterRender;
import com.pedro.encoder.input.gl.render.filters.ColorFilterRender;
import com.pedro.encoder.input.gl.render.filters.ContrastFilterRender;
import com.pedro.encoder.input.gl.render.filters.DuotoneFilterRender;
import com.pedro.encoder.input.gl.render.filters.EarlyBirdFilterRender;
import com.pedro.encoder.input.gl.render.filters.EdgeDetectionFilterRender;
import com.pedro.encoder.input.gl.render.filters.ExposureFilterRender;
import com.pedro.encoder.input.gl.render.filters.FireFilterRender;
import com.pedro.encoder.input.gl.render.filters.GammaFilterRender;
import com.pedro.encoder.input.gl.render.filters.GlitchFilterRender;
import com.pedro.encoder.input.gl.render.filters.GreyScaleFilterRender;
import com.pedro.encoder.input.gl.render.filters.HalftoneLinesFilterRender;
import com.pedro.encoder.input.gl.render.filters.Image70sFilterRender;
import com.pedro.encoder.input.gl.render.filters.LamoishFilterRender;
import com.pedro.encoder.input.gl.render.filters.MoneyFilterRender;
import com.pedro.encoder.input.gl.render.filters.NegativeFilterRender;
import com.pedro.encoder.input.gl.render.filters.NoFilterRender;
import com.pedro.encoder.input.gl.render.filters.PixelatedFilterRender;
import com.pedro.encoder.input.gl.render.filters.PolygonizationFilterRender;
import com.pedro.encoder.input.gl.render.filters.RGBSaturationFilterRender;
import com.pedro.encoder.input.gl.render.filters.RainbowFilterRender;
import com.pedro.encoder.input.gl.render.filters.RippleFilterRender;
import com.pedro.encoder.input.gl.render.filters.RotationFilterRender;
import com.pedro.encoder.input.gl.render.filters.SaturationFilterRender;
import com.pedro.encoder.input.gl.render.filters.SepiaFilterRender;
import com.pedro.encoder.input.gl.render.filters.SharpnessFilterRender;
import com.pedro.encoder.input.gl.render.filters.SnowFilterRender;
import com.pedro.encoder.input.gl.render.filters.SwirlFilterRender;
import com.pedro.encoder.input.gl.render.filters.TemperatureFilterRender;
import com.pedro.encoder.input.gl.render.filters.ZebraFilterRender;
import com.pedro.encoder.input.gl.render.filters.object.GifObjectFilterRender;
import com.pedro.encoder.input.gl.render.filters.object.ImageObjectFilterRender;
import com.pedro.encoder.input.gl.render.filters.object.SurfaceFilterRender;
import com.pedro.encoder.input.gl.render.filters.object.TextObjectFilterRender;
import com.pedro.encoder.input.video.CameraHelper;
import com.pedro.encoder.input.video.CameraOpenException;
import com.pedro.encoder.utils.gl.TranslateTo;
import com.pedro.library.rtmp.RtmpCamera1;
import com.pedro.library.view.OpenGlView;
import com.pedro.rtmp.utils.ConnectCheckerRtmp;
import com.skydoves.colorpickerview.ColorPickerView;
import com.skydoves.colorpickerview.listeners.ColorListener;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Locale;
import java.util.Objects;
import java.util.concurrent.atomic.AtomicInteger;
public class OpenGlRtmpActivity extends AppCompatActivity implements ConnectCheckerRtmp, View.OnClickListener, SurfaceHolder.Callback, View.OnTouchListener, TopSheetFragment.ItemListener {
private ActivityOpenGlBinding binding;
private RtmpCamera1 rtmpCamera1;
private ImageView startStopStream;
private ImageView startStopStreamRecord;
private String currentDateAndTime = "";
private File folder;
private OpenGlView openGlView;
private SpriteGestureController spriteGestureController = new SpriteGestureController();
private static final int IMAGE_REQUESTCODE = 999;
private static final int VIDEO_REQUESTCODE = 998;
private static final int GIF_REQUESTCODE = 997;
private float rotation = 0f;
private TopSheetFragment fragment;
// Options
private DrawerLayout drawerLayout;
private NavigationView navigationView;
private Spinner spResolution, spVideoBitrate, spFps, spAudioBitrate, spSampleRate, spChannel, spStreamKeys;
private CheckBox cbEchoCanceler, cbNoiseSuppressor;
private UserViewModel userViewModel;
private static final int PERMISSION_REQUEST_CAMERA = 1;
private static final int PERMISSION_REQUEST_MICROPHONE = 2;
@Override
protected void onCreate(Bundle savedInstanceState) {
OpenGlRtmpActivity.this.setTheme(com.ipcloud.cms.R.style.User);
super.onCreate(savedInstanceState);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
binding = ActivityOpenGlBinding.inflate(getLayoutInflater());
setContentView(binding.getRoot());
userViewModel = new ViewModelProvider(this, new UserViewModelFactory(new UserRepository(UserApiUtils.INSTANCE.apiInstance()))).get(UserViewModel.class);
fragment = new TopSheetFragment();
if (fragment != null) {
getSupportFragmentManager().beginTransaction().replace(R.id.container, fragment).commit();
fragment.setListener(this);
}
binding.icon.setOnClickListener(view -> {
// start and end motion layout
if (binding.motionLayout.getProgress() == 0f) {
binding.motionLayout.transitionToEnd();
} else {
binding.motionLayout.transitionToStart();
}
});
binding.motionLayout.setTransitionListener(new MotionLayout.TransitionListener() {
@Override
public void onTransitionTrigger(MotionLayout motionLayout, int i, boolean b, float v) {
}
@Override
public void onTransitionStarted(MotionLayout motionLayout, int i, int i1) {
}
@Override
public void onTransitionChange(MotionLayout motionLayout, int i, int i1, float v) {
binding.icon.animate().rotationBy(180f);
}
@Override
public void onTransitionCompleted(MotionLayout motionLayout, int i) {
}
});
folder = PathUtils.getRecordPath();
openGlView = findViewById(R.id.surfaceView);
startStopStream = findViewById(R.id.b_start_stop);
startStopStream.setOnClickListener(this);
startStopStreamRecord = findViewById(R.id.b_record);
startStopStreamRecord.setOnClickListener(this);
ImageView switchCamera = findViewById(R.id.switch_camera);
switchCamera.setOnClickListener(this);
rtmpCamera1 = new RtmpCamera1(openGlView, this);
openGlView.getHolder().addCallback(this);
openGlView.setOnTouchListener(this);
prepareOptionsMenuViews();
updateLiveDateAndTime();
}
private void updateLiveDateAndTime() {
new Thread(() -> {
// Stop updating once the activity is destroyed to avoid leaking this thread
while (!isDestroyed()) {
try {
Thread.sleep(1000);
runOnUiThread(() -> {
currentDateAndTime = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss", Locale.getDefault()).format(new Date());
binding.dateAndTimeTV.setText(currentDateAndTime);
});
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}).start();
}
private void prepareOptionsMenuViews() {
Toolbar toolbar = findViewById(R.id.myToolbar);
setSupportActionBar(toolbar);
drawerLayout = findViewById(R.id.activity_rtmp);
navigationView = findViewById(R.id.nv_rtmp);
navigationView.inflateMenu(R.menu.options_rtmp);
/* actionBarDrawerToggle = new ActionBarDrawerToggle(this, drawerLayout, toolbar, com.ipcloud.cms.R.string.open_drawer, com.ipcloud.cms.R.string.close_drawer) {
public void onDrawerOpened(View drawerView) {
actionBarDrawerToggle.syncState();
lastVideoBitrate = etVideoBitrate.getText().toString();
}
public void onDrawerClosed(View view) {
actionBarDrawerToggle.syncState();
if (lastVideoBitrate != null && !lastVideoBitrate.equals(etVideoBitrate.getText().toString()) && rtmpCamera1.isStreaming()) {
int bitrate = Integer.parseInt(etVideoBitrate.getText().toString()) * 1024;
rtmpCamera1.setVideoBitrateOnFly(bitrate);
Toast.makeText(OpenGlRtmpActivity.this, "New bitrate: " + bitrate, Toast.LENGTH_SHORT).show();
}
}
};
drawerLayout.addDrawerListener(actionBarDrawerToggle);*/
ImageView menuIV = findViewById(R.id.menuIV);
menuIV.setOnClickListener(v -> drawerLayout.openDrawer(navigationView));
// checkboxs
cbEchoCanceler = (CheckBox) navigationView.getMenu().findItem(R.id.cb_echo_canceler).getActionView();
cbNoiseSuppressor = (CheckBox) navigationView.getMenu().findItem(R.id.cb_noise_suppressor).getActionView();
// spinners
spResolution = (Spinner) navigationView.getMenu().findItem(R.id.sp_resolution).getActionView();
ArrayAdapter<String> resolutionAdapter =
new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
List<String> list = new ArrayList<>();
for (Camera.Size size : rtmpCamera1.getResolutionsBack()) {
list.add(size.width + "X" + size.height);
}
resolutionAdapter.addAll(list);
spResolution.setAdapter(resolutionAdapter);
spVideoBitrate = (Spinner) navigationView.getMenu().findItem(R.id.sp_video_bitrate).getActionView();
spFps = (Spinner) navigationView.getMenu().findItem(R.id.sp_fps).getActionView();
spAudioBitrate = (Spinner) navigationView.getMenu().findItem(R.id.sp_audio_bitrate).getActionView();
spSampleRate = (Spinner) navigationView.getMenu().findItem(R.id.sp_samplerate).getActionView();
spChannel = (Spinner) navigationView.getMenu().findItem(R.id.sp_channel).getActionView();
ArrayAdapter<String> videoBitrateAdapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
videoBitrateAdapter.addAll(getResources().getStringArray(R.array.video_bitrate));
spVideoBitrate.setAdapter(videoBitrateAdapter);
spVideoBitrate.setSelection(1);
ArrayAdapter<String> fpsAdapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
fpsAdapter.addAll(getResources().getStringArray(R.array.fps));
spFps.setAdapter(fpsAdapter);
spFps.setSelection(2);
ArrayAdapter<String> audioBitrateAdapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
audioBitrateAdapter.addAll(getResources().getStringArray(R.array.audio_bitrate));
spAudioBitrate.setAdapter(audioBitrateAdapter);
spAudioBitrate.setSelection(3);
ArrayAdapter<String> sampleRateAdapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
sampleRateAdapter.addAll(getResources().getStringArray(R.array.sample_rate));
spSampleRate.setAdapter(sampleRateAdapter);
spSampleRate.setSelection(5);
ArrayAdapter<String> channelAdapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
channelAdapter.addAll(getResources().getStringArray(R.array.channel));
spChannel.setAdapter(channelAdapter);
spChannel.setSelection(1);
// stream keys
spStreamKeys = (Spinner) navigationView.getMenu().findItem(R.id.sp_stream_keys).getActionView();
ArrayAdapter<String> streamKeysAdapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item);
UserStreamKeyListPostModel userStreamKeyListPostModel = new UserStreamKeyListPostModel("rtmp");
userViewModel.getUserStreamKeyList(new SP(OpenGlRtmpActivity.this).getToken(), userStreamKeyListPostModel);
Observer<? super NetworkResults<UserStreamKeyListModel>> observer = (Observer<NetworkResults<UserStreamKeyListModel>>) response -> {
if (response.getData() != null) {
UserStreamKeyListModel userStreamKeyListModel = response.getData();
if (userStreamKeyListModel != null) {
if (userStreamKeyListModel.getStatus_code() == 200) {
for (Data data : userStreamKeyListModel.getData()) {
Log.d("MICODERRR", "prepareOptionsMenuViews: " + data.getStreamkey());
streamKeysAdapter.add(data.getStreamkey());
}
spStreamKeys.setAdapter(streamKeysAdapter);
spStreamKeys.setSelection(0);
} else {
Toast.makeText(this, userStreamKeyListModel.getMessage(), Toast.LENGTH_SHORT).show();
}
}
} else {
Toast.makeText(this, "Something went wrong", Toast.LENGTH_SHORT).show();
}
};
userViewModel.getUserStreamKeyListModelLiveData().observe(this, observer);
// Screen orientation
binding.rotateIV.setTag("1");
binding.rotateIV.setOnClickListener(v -> {
if (binding.rotateIV.getTag().equals("0")) {
binding.rotateIV.setTag("1");
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
} else {
binding.rotateIV.setTag("0");
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
}
});
spResolution.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
@Override
public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
startPreview();
}
@Override
public void onNothingSelected(AdapterView<?> parent) {
// startPreview();
}
});
}
private boolean prepareEncoders() {
Camera.Size resolution = rtmpCamera1.getResolutionsBack().get(spResolution.getSelectedItemPosition());
int width = resolution.width;
int height = resolution.height;
return rtmpCamera1.prepareVideo(width, height, Integer.parseInt(spFps.getSelectedItem().toString()),
Integer.parseInt(spVideoBitrate.getSelectedItem().toString()) * 1024,
CameraHelper.getCameraOrientation(this)) && rtmpCamera1.prepareAudio(Integer.parseInt(spAudioBitrate.getSelectedItem().toString()) * 1024,
Integer.parseInt(spSampleRate.getSelectedItem().toString()),
getResources().getStringArray(R.array.channel)[1].equals(spChannel.getSelectedItem()),
cbEchoCanceler.isChecked(),
cbNoiseSuppressor.isChecked());
}
public void startPreview() {
if (prepareEncoders()) {
try {
rtmpCamera1.startPreview();
} catch (CameraOpenException e) {
e.printStackTrace();
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
} else {
Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT).show();
}
}
@Override
public void onItemClicked(int id) {
switch (id) {
case 999:
setImageToStream();
break;
case 998:
setGifToStream();
break;
case 997:
setTextToStream();
break;
case 996:
setVideoToStream();
break;
case 1:
rtmpCamera1.getGlInterface().setFilter(new NoFilterRender());
break;
case 2:
rtmpCamera1.getGlInterface().setFilter(new AnalogTVFilterRender());
break;
case 3:
rtmpCamera1.getGlInterface().setFilter(new BasicDeformationFilterRender());
break;
case 4:
rtmpCamera1.getGlInterface().setFilter(new BeautyFilterRender());
break;
case 5:
rtmpCamera1.getGlInterface().setFilter(new BlackFilterRender());
break;
case 6:
rtmpCamera1.getGlInterface().setFilter(new BlurFilterRender());
break;
case 7:
rtmpCamera1.getGlInterface().setFilter(new BrightnessFilterRender());
break;
case 9:
rtmpCamera1.getGlInterface().setFilter(new CartoonFilterRender());
break;
case 10:
ChromaFilterRender chromaFilterRender = new ChromaFilterRender();
rtmpCamera1.getGlInterface().setFilter(chromaFilterRender);
chromaFilterRender.setImage(BitmapFactory.decodeResource(getResources(), R.drawable.bg_chroma));
break;
case 11:
rtmpCamera1.getGlInterface().setFilter(new CircleFilterRender());
break;
case 12:
rtmpCamera1.getGlInterface().setFilter(new ColorFilterRender());
break;
case 13:
rtmpCamera1.getGlInterface().setFilter(new ContrastFilterRender());
break;
case 14:
rtmpCamera1.getGlInterface().setFilter(new DuotoneFilterRender());
break;
case 15:
rtmpCamera1.getGlInterface().setFilter(new EarlyBirdFilterRender());
break;
case 16:
rtmpCamera1.getGlInterface().setFilter(new EdgeDetectionFilterRender());
break;
case 17:
rtmpCamera1.getGlInterface().setFilter(new ExposureFilterRender());
break;
case 18:
rtmpCamera1.getGlInterface().setFilter(new FireFilterRender());
break;
case 19:
rtmpCamera1.getGlInterface().setFilter(new GammaFilterRender());
break;
case 20:
rtmpCamera1.getGlInterface().setFilter(new GlitchFilterRender());
break;
case 21:
rtmpCamera1.getGlInterface().setFilter(new GreyScaleFilterRender());
break;
case 22:
rtmpCamera1.getGlInterface().setFilter(new HalftoneLinesFilterRender());
break;
case 23:
rtmpCamera1.getGlInterface().setFilter(new Image70sFilterRender());
break;
case 24:
rtmpCamera1.getGlInterface().setFilter(new LamoishFilterRender());
break;
case 25:
rtmpCamera1.getGlInterface().setFilter(new MoneyFilterRender());
break;
case 26:
rtmpCamera1.getGlInterface().setFilter(new NegativeFilterRender());
break;
case 27:
rtmpCamera1.getGlInterface().setFilter(new PixelatedFilterRender());
break;
case 28:
rtmpCamera1.getGlInterface().setFilter(new PolygonizationFilterRender());
break;
case 29:
rtmpCamera1.getGlInterface().setFilter(new RainbowFilterRender());
break;
case 30:
RGBSaturationFilterRender rgbSaturationFilterRender = new RGBSaturationFilterRender();
rtmpCamera1.getGlInterface().setFilter(rgbSaturationFilterRender);
//Reduce green and blue colors 20%. Red will predominate.
rgbSaturationFilterRender.setRGBSaturation(1f, 0.8f, 0.8f);
break;
case 31:
rtmpCamera1.getGlInterface().setFilter(new RippleFilterRender());
break;
case 32:
RotationFilterRender rotationFilterRender = new RotationFilterRender();
rtmpCamera1.getGlInterface().setFilter(rotationFilterRender);
rotationFilterRender.setRotation(90);
break;
case 33:
rtmpCamera1.getGlInterface().setFilter(new SaturationFilterRender());
break;
case 34:
rtmpCamera1.getGlInterface().setFilter(new SepiaFilterRender());
break;
case 35:
rtmpCamera1.getGlInterface().setFilter(new SharpnessFilterRender());
break;
case 36:
rtmpCamera1.getGlInterface().setFilter(new SnowFilterRender());
break;
case 37:
rtmpCamera1.getGlInterface().setFilter(new SwirlFilterRender());
break;
case 38:
rtmpCamera1.getGlInterface().setFilter(new TemperatureFilterRender());
break;
case 39:
rtmpCamera1.getGlInterface().setFilter(new ZebraFilterRender());
break;
}
}
@Override
protected void onResume() {
super.onResume();
if (fragment != null)
fragment.setListener(this);
}
private void setTextToStream() {
View dialogView = getLayoutInflater().inflate(R.layout.dialog_text_input, null);
EditText editText = dialogView.findViewById(R.id.textTIET);
TextInputLayout textInputLayout = dialogView.findViewById(R.id.textTIL);
ColorPickerView colorPickerView = dialogView.findViewById(R.id.colorPickerView);
AtomicInteger currentColor = new AtomicInteger(Color.parseColor("#FF00FF00"));
colorPickerView.setColorListener((ColorListener) (color, fromUser) -> {
currentColor.set(color);
editText.setTextColor(color);
textInputLayout.setBoxStrokeColor(color);
textInputLayout.setHintTextColor(ColorStateList.valueOf(color));
});
// Create a MaterialAlertDialogBuilder with the custom layout
MaterialAlertDialogBuilder builder = new MaterialAlertDialogBuilder(this).setTitle("Enter text").setIcon(com.ipcloud.cms.R.mipmap.ic_launcher).setCancelable(false).setView(dialogView).setPositiveButton("OK", (dialog, which) -> {
// Get user input from the EditText
String text = editText.getText().toString().trim();
if (!text.isEmpty()) {
// Create a TextObjectFilterRender and set the text
TextObjectFilterRender textObjectFilterRender = new TextObjectFilterRender();
rtmpCamera1.getGlInterface().setFilter(textObjectFilterRender);
textObjectFilterRender.setText(text, 22, currentColor.get());
textObjectFilterRender.setDefaultScale(rtmpCamera1.getStreamWidth(), rtmpCamera1.getStreamHeight());
textObjectFilterRender.setPosition(TranslateTo.CENTER);
spriteGestureController.setBaseObjectFilterRender(textObjectFilterRender); // Optional
}
dialog.dismiss();
}).setNegativeButton("Cancel", (dialog, which) -> dialog.dismiss());
// Show the dialog
builder.show();
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if ((data != null) && (requestCode == IMAGE_REQUESTCODE)) {
try {
ImageObjectFilterRender imageObjectFilterRender = new ImageObjectFilterRender();
rtmpCamera1.getGlInterface().setFilter(imageObjectFilterRender);
imageObjectFilterRender.setImage(BitmapFactory.decodeFile(PathUtils.getPath(this, data.getData())));
imageObjectFilterRender.setPosition(TranslateTo.RIGHT);
spriteGestureController.setBaseObjectFilterRender(imageObjectFilterRender); //Optional
spriteGestureController.setPreventMoveOutside(false); //Optional
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
} else if ((data != null) && (requestCode == GIF_REQUESTCODE)) {
try {
GifObjectFilterRender gifObjectFilterRender = new GifObjectFilterRender();
InputStream inputStream = getContentResolver().openInputStream(Objects.requireNonNull(data.getData()));
assert inputStream != null;
gifObjectFilterRender.setGif(inputStream);
rtmpCamera1.getGlInterface().setFilter(gifObjectFilterRender);
gifObjectFilterRender.setDefaultScale(rtmpCamera1.getStreamWidth(), rtmpCamera1.getStreamHeight());
gifObjectFilterRender.setPosition(TranslateTo.BOTTOM);
spriteGestureController.setBaseObjectFilterRender(gifObjectFilterRender); //Optional
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
} else if ((data != null) && (requestCode == VIDEO_REQUESTCODE)) {
try {
SurfaceFilterRender surfaceFilterRender = new SurfaceFilterRender(surfaceTexture -> {
//You can render this filter with any other API that draws into a Surface. For example, you can use VLC.
MediaPlayer mediaPlayer = MediaPlayer.create(OpenGlRtmpActivity.this, data.getData());
mediaPlayer.setSurface(new Surface(surfaceTexture));
mediaPlayer.start();
});
rtmpCamera1.getGlInterface().setFilter(surfaceFilterRender);
//Video is 360x240 so select a percent to keep aspect ratio (50% x 33.3% screen)
surfaceFilterRender.setScale(50f, 33.3f);
spriteGestureController.setBaseObjectFilterRender(surfaceFilterRender); //Optional
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
}
}
private void setImageToStream() {
// select image from gallery
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(intent, IMAGE_REQUESTCODE);
}
private void setGifToStream() {
// select gif only from storage
Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT);
intent.addCategory(Intent.CATEGORY_OPENABLE);
intent.setType("image/gif"); // Set MIME type to GIF images
startActivityForResult(intent, GIF_REQUESTCODE);
}
private void setVideoToStream() {
// select video from gallery
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Video.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(intent, VIDEO_REQUESTCODE);
}
@Override
public void onConnectionStartedRtmp(String rtmpUrl) {
}
@Override
public void onConnectionSuccessRtmp() {
Toast.makeText(OpenGlRtmpActivity.this, "Connection success", Toast.LENGTH_SHORT).show();
}
@Override
public void onConnectionFailedRtmp(final String reason) {
Toast.makeText(OpenGlRtmpActivity.this, "Connection failed. " + reason, Toast.LENGTH_SHORT).show();
rtmpCamera1.stopStream();
startStopStream.setImageResource(R.drawable.ic_start_record);
binding.menuIV.setVisibility(View.VISIBLE);
binding.rotateIV.setVisibility(View.VISIBLE);
binding.dateAndTimeTV.setVisibility(View.GONE);
}
@Override
public void onNewBitrateRtmp(long bitrate) {
}
@Override
public void onDisconnectRtmp() {
Toast.makeText(OpenGlRtmpActivity.this, "Disconnected", Toast.LENGTH_SHORT).show();
}
@Override
public void onAuthErrorRtmp() {
Toast.makeText(OpenGlRtmpActivity.this, "Auth error", Toast.LENGTH_SHORT).show();
}
@Override
public void onAuthSuccessRtmp() {
Toast.makeText(OpenGlRtmpActivity.this, "Auth success", Toast.LENGTH_SHORT).show();
}
@Override
public void onClick(View view) {
int id = view.getId();
if (id == R.id.b_start_stop) {
if (!rtmpCamera1.isStreaming()) {
if (rtmpCamera1.isRecording() || prepareEncoders()) {
startStopStream.setImageResource(R.drawable.ic_stop_record);
binding.menuIV.setVisibility(View.GONE);
binding.rotateIV.setVisibility(View.GONE);
binding.dateAndTimeTV.setVisibility(View.VISIBLE);
String userName = new SP(OpenGlRtmpActivity.this).getUserName();
String streamKey = "rtmp://ipcloud.live/" + userName + "/" + spStreamKeys.getSelectedItem();
rtmpCamera1.startStream(streamKey);
} else {
Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT).show();
}
} else {
startStopStream.setImageResource(R.drawable.ic_start_record);
rtmpCamera1.stopStream();
binding.menuIV.setVisibility(View.VISIBLE);
binding.rotateIV.setVisibility(View.VISIBLE);
binding.dateAndTimeTV.setVisibility(View.GONE);
}
} else if (id == R.id.switch_camera) {
try {
rtmpCamera1.switchCamera();
} catch (CameraOpenException e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
} else if (id == R.id.b_record) {
if (!rtmpCamera1.isRecording()) {
try {
if (!folder.exists()) {
folder.mkdir();
}
SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault());
currentDateAndTime = sdf.format(new Date());
if (!rtmpCamera1.isStreaming()) {
if (prepareEncoders()) {
rtmpCamera1.startRecord(folder.getAbsolutePath() + "/" + currentDateAndTime + ".mp4");
startStopStreamRecord.setImageResource(R.drawable.ic_stop_record_file);
Toast.makeText(this, "Recording... ", Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT).show();
}
} else {
rtmpCamera1.startRecord(folder.getAbsolutePath() + "/" + currentDateAndTime + ".mp4");
startStopStreamRecord.setImageResource(R.drawable.ic_stop_record_file);
Toast.makeText(this, "Recording... ", Toast.LENGTH_SHORT).show();
}
} catch (IOException e) {
rtmpCamera1.stopRecord();
PathUtils.updateGallery(this, folder.getAbsolutePath() + "/" + currentDateAndTime + ".mp4");
startStopStreamRecord.setImageResource(R.drawable.ic_start_record_file);
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
} else {
rtmpCamera1.stopRecord();
PathUtils.updateGallery(this, folder.getAbsolutePath() + "/" + currentDateAndTime + ".mp4");
startStopStreamRecord.setImageResource(R.drawable.ic_start_record_file);
Toast.makeText(this, "file " + currentDateAndTime + ".mp4 saved in " + folder.getAbsolutePath(), Toast.LENGTH_SHORT).show();
currentDateAndTime = "";
}
}
}
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
rtmpCamera1.startPreview();
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
if (rtmpCamera1.isRecording()) {
rtmpCamera1.stopRecord();
PathUtils.updateGallery(this, folder.getAbsolutePath() + "/" + currentDateAndTime + ".mp4");
startStopStreamRecord.setImageResource(R.drawable.ic_start_record_file);
Toast.makeText(this, "file " + currentDateAndTime + ".mp4 saved in " + folder.getAbsolutePath(), Toast.LENGTH_SHORT).show();
currentDateAndTime = "";
}
if (rtmpCamera1.isStreaming()) {
rtmpCamera1.stopStream();
startStopStream.setImageResource(R.drawable.ic_start_record);
binding.menuIV.setVisibility(View.VISIBLE);
binding.rotateIV.setVisibility(View.VISIBLE);
binding.dateAndTimeTV.setVisibility(View.GONE);
}
rtmpCamera1.stopPreview();
}
@Override
public boolean onTouch(View view, MotionEvent motionEvent) {
if (spriteGestureController.spriteTouched(view, motionEvent)) {
spriteGestureController.moveSprite(view, motionEvent);
spriteGestureController.scaleSprite(motionEvent);
return true;
}
return false;
}
}
Hello,
This is because you are using a surface that is not ready to use. Make sure that you are not calling startPreview or startStream before the surfaceChanged callback.
public void startPreview() {
  if (prepareEncoders()) {
    try {
      rtmpCamera1.startPreview();
    } catch (CameraOpenException e) {
      e.printStackTrace();
      Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
    }
  } else {
    Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT).show();
  }
}
This could be related to this method. Make sure it is not called before surfaceChanged.
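A minimal sketch of that ordering, using the rtmpCamera1 field and the SurfaceHolder.Callback already registered in the activity above (this is only an illustration, not the library author's code):

// Wrong: calling startPreview from onCreate, before the SurfaceView has
// created its Surface, leaves eglCreateWindowSurface with nothing valid to attach to.

// Right: wait for the callback; the Surface is guaranteed to be valid here.
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {
  if (prepareEncoders()) {
    rtmpCamera1.startPreview();
  }
}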
Solved by guarding the call with this -> openGlView.getHolder().getSurface().isValid()
if (openGlView.getHolder().getSurface().isValid()) {
if (prepareEncoders()) {
try {
rtmpCamera1.startPreview();
} catch (CameraOpenException e) {
e.printStackTrace();
Toast.makeText(this, e.getMessage(), Toast.LENGTH_SHORT).show();
}
} else {
Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT).show();
}
}
However, the issue is not fixed in the Play Store production release, even though I didn't use any ProGuard.
Closing as inactive. Reopen issue if needed
The issue is still present and hasn't been solved yet, so it would be better to reopen it.
Hello,
Ok, we can try to refactor the code a bit. I think the problem is related to calling startPreview at the wrong moment. You can try removing startPreview from this listener:
@Override
public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
startPreview();
}
You can call prepareEncoders there, but startPreview should only be called in surfaceChanged, and stopPreview in surfaceDestroyed. Also, you can do a sanity check before calling startPreview and stopPreview:
if (!rtmpCamera1.isOnPreview()) rtmpCamera1.startPreview();
if (rtmpCamera1.isOnPreview()) rtmpCamera1.stopPreview();
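Putting those suggestions together, a rough sketch of how the affected callbacks in the activity above could look (preview handling only; the recording and streaming cleanup already in surfaceDestroyed would stay as it is):

@Override
public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
  // Only reconfigure the encoders when a new resolution is picked;
  // do not touch the preview from here.
  prepareEncoders();
}

@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {
  // The Surface is valid from this point, and the isOnPreview() check
  // avoids starting the preview twice.
  if (!rtmpCamera1.isOnPreview()) rtmpCamera1.startPreview();
}

@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
  // Only stop the preview if it is actually running.
  if (rtmpCamera1.isOnPreview()) rtmpCamera1.stopPreview();
}

This keeps every preview start behind a valid Surface and every stop behind an active preview, which is exactly the sanity check above.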