Android

Table of Contents

1 Connect Android to PC using Debian and udev

  • Get the vendor ID of the manufacturer
    • either look it up at http://developer.android.com/tools/device.html
    • or find the line "[…] usb …: New USB device found, idVendor=04e8, …" in the output of dmesg when you plug the phone in. (In this example, my Samsung phone reports the vendor ID 04e8.)
  • create a udev rule for the Android phone when it plugs in
    • create the file /etc/udev/rules.d/50-android.rules with the following content
      SUBSYSTEM=="usb", ATTR{idVendor}=="04e8", MODE="0666", GROUP="plugdev"
    • restart udev with

      service udev restart
      
  • re-plug the phone
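The steps above can be scripted. A minimal sketch: the rule text, file path, and group are the ones used above, and `android_udev_rule` is just a helper name for this example.

```shell
# Sketch: build the udev rule line for a given USB vendor ID.
android_udev_rule() {
  vendor="$1"
  printf 'SUBSYSTEM=="usb", ATTR{idVendor}=="%s", MODE="0666", GROUP="plugdev"\n' "$vendor"
}

# 04e8 is the Samsung vendor ID from the dmesg line above.
android_udev_rule 04e8

# To install it (as root):
#   android_udev_rule 04e8 > /etc/udev/rules.d/50-android.rules
#   service udev restart
```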

2 Use Android as sound server


2.1 With SoundWire

  • install SoundWire from the Market
  • install the companion program from the web site
  • install pavucontrol
  • launch the program on the computer
  • launch the program on the phone
  • follow the instructions in the README.txt file shipped with the computer program

2.2 With ffserver and alsa

Taken from https://bbs.archlinux.org/viewtopic.php?pid=1165169, see also http://www.alsa-project.org/main/index.php/Asoundrc#Virtual_multi_channel_devices for more precise explanations.
Load the alsa module redirecting sound to a loop device:

modprobe snd-aloop pcm_substreams=2
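The modprobe call above only lasts until the next reboot. To make the loopback device permanent, the module name and its option can go into the standard modprobe configuration files. A sketch assuming Debian-style paths; DESTDIR is only there so you can dry-run in a scratch directory first (set it to the empty string and run as root to write under the real /etc):

```shell
# Sketch: persist snd-aloop across reboots (Debian-style paths).
# DESTDIR defaults to a scratch directory for a dry run; set DESTDIR=""
# (and run as root) to write the real files.
DESTDIR="${DESTDIR-$(mktemp -d)}"
mkdir -p "$DESTDIR/etc/modules-load.d" "$DESTDIR/etc/modprobe.d"

# Load the module at boot...
echo "snd-aloop" > "$DESTDIR/etc/modules-load.d/snd-aloop.conf"
# ...with the same option as the modprobe call above.
echo "options snd-aloop pcm_substreams=2" > "$DESTDIR/etc/modprobe.d/snd-aloop.conf"

echo "wrote module config under $DESTDIR/etc"
```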

Create the ALSA configuration file and store it either in ~/.asoundrc (for a single user) or in /etc/asound.conf (system wide):

pcm.!default {
  type asym
  playback.pcm "LoopAndReal"
  capture.pcm "hw:0,0"
  hint {
    show on
    description "Default with loopback"
  }
}

#"type plug" is mandatory to convert sample type
pcm.LoopAndReal {
  type plug
  slave.pcm mdev
  route_policy "duplicate"
  hint {
    show on
    description "LoopAndReal"
  }
}

pcm.mdev {
  type multi
  slaves.a.pcm pcm.MixReal
  slaves.a.channels 2
  slaves.b.pcm pcm.MixLoopback
  slaves.b.channels 2
  bindings.0.slave a
  bindings.0.channel 0
  bindings.1.slave a
  bindings.1.channel 1
  bindings.2.slave b
  bindings.2.channel 0
  bindings.3.slave b
  bindings.3.channel 1
}

pcm.MixReal {
  type dmix
  ipc_key 1024
  slave {
    pcm "hw:0,0"
    #rate 48000
    #rate 44100
    #periods 128
    #period_time 0
    #period_size 1024 # must be power of 2
    #buffer_size 8192
  }
}

pcm.MixLoopback {
  type dmix
  ipc_key 1025
  slave {
    pcm "hw:Loopback,0,0"
    #rate 48000
    #rate 44100
    #periods 128
    #period_time 0
    #period_size 1024 # must be power of 2
    #buffer_size 8192
  }
}

You can play with sample rates and buffer sizes if you have any problems. This configuration works on my system.
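When storing the file, it is worth backing up any existing configuration first. A small sketch; `backup_and_note` and ASOUND_TARGET are just names for this example, with the per-user location as the default:

```shell
# Sketch: pick the destination for the ALSA config above and back up
# whatever is already there before overwriting it.
ASOUND_TARGET="${ASOUND_TARGET-$HOME/.asoundrc}"   # or /etc/asound.conf system wide

backup_and_note() {
  target="$1"
  if [ -f "$target" ]; then
    cp "$target" "$target.bak"
    echo "backed up $target to $target.bak"
  fi
  echo "store the pcm.!default / LoopAndReal config in $target"
}

backup_and_note "$ASOUND_TARGET"
```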
Prepare the ffserver configuration and store it either in the default location /etc/ffserver.conf (system wide) or anywhere in your home directory:

# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2000

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000

# This is the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
NoDaemon


##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.

<Feed feed1.ffm>

# You must use 'ffmpeg' to send a live feed to ffserver. In this
# example, you can type:
#
# ffmpeg http://localhost:8090/feed1.ffm

# ffserver can also do time shifting. It means that it can stream any
# previously recorded live stream. The request should contain:
# "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]". You must specify
# a path where the feed is stored on disk. You also specify the
# maximum size of the feed, where zero means unlimited. Default:
# File=/tmp/feed_name.ffm FileMaxSize=5M
File /tmp/feed1.ffm
FileMaxSize 200K

# You could specify
# ReadOnlyFile /saved/specialvideo.ffm
# This marks the file as readonly and it will not be deleted or updated.

# Specify launch in order to start ffmpeg automatically.
# First ffmpeg must be defined with an appropriate path if needed,
# after that options can follow, but avoid adding the http:// field
#Launch ffmpeg

# Only allow connections from localhost to the feed.
#ACL allow 127.0.0.1

</Feed>


##################################################################
# Now you can define each stream which will be generated from the
# original audio and video stream. Each format has a filename (here
# 'test1.mpg'). FFServer will send this stream when answering a
# request containing this filename.

# MP3 audio
<Stream stream.mp3>
Feed feed1.ffm
Format mp2
AudioCodec libmp3lame
AudioBitRate 320
AudioChannels 2
AudioSampleRate 44100
NoVideo
</Stream>


# Ogg Vorbis audio
#<Stream test.ogg>
#Feed feed1.ffm
#Format ogg
#AudioCodec libvorbis
#Title "Stream title"
#AudioBitRate 64
#AudioChannels 2
#AudioSampleRate 44100
#NoVideo
#</Stream>

##################################################################
# Special streams

# Server status

<Stream stat.html>
Format status

# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.1.0 192.168.1.255

#FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
</Stream>

# Redirect index.html to the appropriate site
<Redirect index.html>
URL http://www.ffmpeg.org/
</Redirect>

This sets up ffserver for streaming in MP3 format, stereo, at 320 kbps.
Now you have all the configuration you need; to start streaming, run the following two commands:

ffserver -f ffserver.conf
ffmpeg -f alsa -ac 2 -i hw:Loopback,1,0 http://localhost:8090/feed1.ffm

You can test it, for example, with mplayer:

mplayer http://YourLinuxBox:8090/stream.mp3

And that's it. Sound is played by the normal sound card and sent to the stream simultaneously. If you do not want to hear sound from the computer, you can mute your sound card. The advantage is that you can listen to music on the computer normally, with or without streaming, in both cases without any reconfiguration. To start streaming, just call ffserver and ffmpeg.
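The two commands above can be wrapped in a small script that starts ffserver, then the ffmpeg capture, and kills both on Ctrl-C. A sketch; FFSERVER_CONF and `start_streaming` are assumptions of this example, and the device and URLs are the ones used above.

```shell
# Sketch of a wrapper around the two streaming commands above.
FFSERVER_CONF="${FFSERVER_CONF:-/etc/ffserver.conf}"
FEED_URL="http://localhost:8090/feed1.ffm"

start_streaming() {
  ffserver -f "$FFSERVER_CONF" &
  server_pid=$!
  sleep 1   # give ffserver time to open the feed
  ffmpeg -f alsa -ac 2 -i hw:Loopback,1,0 "$FEED_URL" &
  encoder_pid=$!
  # Kill both processes when the script is interrupted.
  trap 'kill "$server_pid" "$encoder_pid" 2>/dev/null' INT TERM
  wait
}

echo "run start_streaming, then listen at http://localhost:8090/stream.mp3"
```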
Advantages:

  • very simple solution without any special sound server
  • no special software required (in my case I had already installed everything I needed)
  • streaming on request with two simple commands
  • normal sound card operation is preserved
  • streaming in MP3 format, which is supported by many home AV receivers

Disadvantages:

  • the phonon-vlc backend is not compatible (VLC does not work either)
  • OGG streaming does not work
  • some latency (~5 s)
  • all sounds are sent to the stream, including desktop notifications (in KDE this can be managed through Phonon)

Author: konubinix

Created: 2015-11-01 Sun. 19:07

Emacs 24.3.50.1 (Org mode 8.0.3)
