Spark Streaming Setup Notes

Doug Chang


Goal: create a Spark Streaming program using the Twitter feed.

This is one use case where Spark did a better job than Hadoop in packaging the components for data acquisition. Subscribe to the Twitter developer API and show a stream of data being processed by Spark.

Debug Step 1: verify you can subscribe to the Twitter feed. The developer API and subsequent web searches aren't that clear on how to sign up to get both a consumer key/secret and an access token/secret. This is the base starting point for the Spark tutorials.

Download the Twitter4J API or create a Maven project with the Twitter4J jars. There are different versions of the API; the more recent versions have renamed the jars.

Do not use my keys. I changed the keys below so nobody can copy them and get me in trouble with Twitter.

package com.example;

import twitter4j.StallWarning;
import twitter4j.Status;
import twitter4j.StatusDeletionNotice;
import twitter4j.StatusListener;
import twitter4j.Twitter;
import twitter4j.TwitterException;
import twitter4j.TwitterFactory;
import twitter4j.TwitterStream;
import twitter4j.TwitterStreamFactory;
import twitter4j.auth.Authorization;
import twitter4j.conf.Configuration;
import twitter4j.conf.ConfigurationBuilder;

public class TestTwitterLogin {

    public static void main(String[] args) throws IllegalStateException, TwitterException {

        // OAuth credentials (dummy values -- substitute your own keys).
        System.setProperty("twitter4j.oauth.consumerKey", "eFIaiOuxsny01VVQ2QWasdf");
        System.setProperty("twitter4j.oauth.consumerSecret", "gDQI5EiCMJJaaNI8XVNhfZXwuCOYfeJ3XsOUNHvsXqgq0asdf");
        System.setProperty("twitter4j.oauth.accessToken", "76976448-Otz8w4yMKx6yCEWTH3dNTfuF8LYeLgqdoDrasdf");
        System.setProperty("twitter4j.oauth.accessTokenSecret", "NFPFe2EzuKWuzRKmY1RENUBfQzGeGbAS1JzjX3asdf");

        TwitterStream twitterStream = new TwitterStreamFactory().getInstance();

        // Listener that prints each incoming status plus the other stream events.
        StatusListener listener = new StatusListener() {

            @Override
            public void onStatus(Status status) {
                System.out.println("------STATUS------");
                System.out.println("@" + status.getUser().getScreenName() + " - " + status.getText());
            }

            @Override
            public void onDeletionNotice(StatusDeletionNotice statusDeletionNotice) {
                System.out.println("------ONDELETIONNOTICE------");
                System.out.println("Got a status deletion notice id:" + statusDeletionNotice.getStatusId());
            }

            @Override
            public void onTrackLimitationNotice(int numberOfLimitedStatuses) {
                System.out.println("------LIMITATION------");
                System.out.println("Got track limitation notice:" + numberOfLimitedStatuses);
            }

            @Override
            public void onScrubGeo(long userId, long upToStatusId) {
                System.out.println("Got scrub_geo event userId:" + userId + " upToStatusId:" + upToStatusId);
            }

            @Override
            public void onStallWarning(StallWarning warning) {
                System.out.println("Got stall warning:" + warning);
            }

            @Override
            public void onException(Exception ex) {
                ex.printStackTrace();
            }
        };

        twitterStream.addListener(listener);

        // Start consuming the random sample stream.
        twitterStream.sample();
    }
}

The pom.xml for the Maven build:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>TestJavaTwitter4J</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <build>
    <sourceDirectory>src</sourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency>
      <groupId>org.twitter4j</groupId>
      <artifactId>twitter4j-stream</artifactId>
      <version>4.0.1</version>
    </dependency>
    <dependency>
      <groupId>org.twitter4j</groupId>
      <artifactId>twitter4j-examples</artifactId>
      <version>4.0.1</version>
    </dependency>
  </dependencies>
</project>

Verify you get a printout of tweets to prove your API subscription is working:
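The screenshot of the console output is not reproduced here. Based on the print statements in onStatus above, each sampled tweet should show up roughly like this (screen name and text are placeholders):

------STATUS------
@some_screen_name - the text of the sampled tweet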


Debug Step #2a: Verify you can create a simple Spark program using sbt. We will use this to verify we can print out the Twitter stream. Use as a starting point the documentation on the Spark website for creating simple.sbt:

[dc@localhost spark]$ cat simple.sbt

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.3.0"

[dc@localhost spark]$ ls src/main/scala/SimpleApp.scala

src/main/scala/SimpleApp.scala

[dc@localhost spark]$ cat src/main/scala/SimpleApp.scala

/*** SimpleApp.scala ***/
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/usr/lib/spark/README.md" // Should be some file on your system
    val sc = new SparkContext("local", "Simple App", "/usr/lib/spark",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

Run it (if you are using the Bigtop distribution you have to add sudo):

[dc@localhost spark]$ sudo sbt/sbt run

Launching sbt from sbt/sbt-launch-0.12.4.jar

[info] Loading project definition from /usr/lib/spark/project

[info] Set current project to Simple Project (in build file:/usr/lib/spark/)

[info] Running SimpleApp

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/usr/lib/spark/lib/slf4j-log4j12-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/root/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

14/06/24 13:53:11 INFO Utils: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

14/06/24 13:53:11 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.171.1 instead (on interface vmnet8)

14/06/24 13:53:11 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address

14/06/24 13:53:12 INFO Slf4jLogger: Slf4jLogger started

14/06/24 13:53:12 INFO Remoting: Starting remoting

14/06/24 13:53:12 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://:45736]

14/06/24 13:53:12 INFO Remoting: Remoting now listens on addresses: [akka.tcp://:45736]

14/06/24 13:53:12 INFO SparkEnv: Registering BlockManagerMaster

14/06/24 13:53:13 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20140624135313-32ec

14/06/24 13:53:13 INFO MemoryStore: MemoryStore started with capacity 640.0 MB.

14/06/24 13:53:13 INFO ConnectionManager: Bound socket to port 34930 with id = ConnectionManagerId(192.168.171.1,34930)

14/06/24 13:53:13 INFO BlockManagerMaster: Trying to register BlockManager

14/06/24 13:53:13 INFO BlockManagerMasterActor$BlockManagerInfo: Registering block manager 192.168.171.1:34930 with 640.0 MB RAM

14/06/24 13:53:13 INFO BlockManagerMaster: Registered BlockManager

14/06/24 13:53:13 INFO HttpServer: Starting HTTP Server

14/06/24 13:53:13 INFO HttpBroadcast: Broadcast server started at

14/06/24 13:53:13 INFO SparkEnv: Registering MapOutputTracker

14/06/24 13:53:13 INFO HttpFileServer: HTTP File server directory is /tmp/spark-06c01eda-da6d-4dd3-af93-3e441204cea0

14/06/24 13:53:13 INFO HttpServer: Starting HTTP Server

14/06/24 13:53:13 INFO SparkUI: Started Spark Web UI at

14/06/24 13:53:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

14/06/24 13:53:14 INFO SparkContext: Added JAR target/scala-2.10/simple-project_2.10-1.0.jar at with timestamp 1403643194173

14/06/24 13:53:14 INFO MemoryStore: ensureFreeSpace(152694) called with curMem=0, maxMem=671101747

14/06/24 13:53:14 INFO MemoryStore: Block broadcast_0 stored as values to memory (estimated size 149.1 KB, free 639.9 MB)

14/06/24 13:53:14 INFO FileInputFormat: Total input paths to process : 1

14/06/24 13:53:14 INFO SparkContext: Starting job: count at SimpleApp.scala:11

14/06/24 13:53:14 INFO DAGScheduler: Got job 0 (count at SimpleApp.scala:11) with 2 output partitions (allowLocal=false)

14/06/24 13:53:14 INFO DAGScheduler: Final stage: Stage 0 (count at SimpleApp.scala:11)

14/06/24 13:53:14 INFO DAGScheduler: Parents of final stage: List()

14/06/24 13:53:14 INFO DAGScheduler: Missing parents: List()

14/06/24 13:53:14 INFO DAGScheduler: Submitting Stage 0 (FilteredRDD[2] at filter at SimpleApp.scala:11), which has no missing parents

14/06/24 13:53:14 INFO DAGScheduler: Submitting 2 missing tasks from Stage 0 (FilteredRDD[2] at filter at SimpleApp.scala:11)

14/06/24 13:53:14 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks

14/06/24 13:53:14 INFO TaskSetManager: Starting task 0.0:0 as TID 0 on executor localhost: localhost (PROCESS_LOCAL)

14/06/24 13:53:14 INFO TaskSetManager: Serialized task 0.0:0 as 1690 bytes in 6 ms

14/06/24 13:53:14 INFO Executor: Running task ID 0

14/06/24 13:53:14 INFO Executor: Fetching with timestamp 1403643194173

14/06/24 13:53:14 INFO Utils: Fetching to /tmp/fetchFileTemp8920244719454374851.tmp

14/06/24 13:53:15 INFO Executor: Adding file:/tmp/spark-14342d4f-45d7-41ce-a902-61dbb6983db3/simple-project_2.10-1.0.jar to class loader

14/06/24 13:53:15 INFO BlockManager: Found block broadcast_0 locally

14/06/24 13:53:15 INFO CacheManager: Partition rdd_1_0 not found, computing it

14/06/24 13:53:15 INFO HadoopRDD: Input split: file:/usr/lib/spark/README.md:0+51

14/06/24 13:53:15 INFO MemoryStore: ensureFreeSpace(384) called with curMem=152694, maxMem=671101747

14/06/24 13:53:15 INFO MemoryStore: Block rdd_1_0 stored as values to memory (estimated size 384.0 B, free 639.9 MB)

14/06/24 13:53:15 INFO BlockManagerMasterActor$BlockManagerInfo: Added rdd_1_0 in memory on 192.168.171.1:34930 (size: 384.0 B, free: 640.0 MB)

14/06/24 13:53:15 INFO BlockManagerMaster: Updated info of block rdd_1_0

14/06/24 13:53:15 INFO Executor: Serialized size of result for 0 is 563

14/06/24 13:53:15 INFO Executor: Sending result for 0 directly to driver

14/06/24 13:53:15 INFO Executor: Finished task ID 0

14/06/24 13:53:15 INFO TaskSetManager: Starting task 0.0:1 as TID 1 on executor localhost: localhost (PROCESS_LOCAL)

14/06/24 13:53:15 INFO TaskSetManager: Serialized task 0.0:1 as 1690 bytes in 1 ms

14/06/24 13:53:15 INFO Executor: Running task ID 1

14/06/24 13:53:15 INFO BlockManager: Found block broadcast_0 locally

14/06/24 13:53:15 INFO TaskSetManager: Finished TID 0 in 250 ms on localhost (progress: 1/2)

14/06/24 13:53:15 INFO CacheManager: Partition rdd_1_1 not found, computing it

14/06/24 13:53:15 INFO HadoopRDD: Input split: file:/usr/lib/spark/README.md:51+51

14/06/24 13:53:15 INFO MemoryStore: ensureFreeSpace(544) called with curMem=153078, maxMem=671101747

14/06/24 13:53:15 INFO MemoryStore: Block rdd_1_1 stored as values to memory (estimated size 544.0 B, free 639.9 MB)

14/06/24 13:53:15 INFO DAGScheduler: Completed ResultTask(0, 0)

14/06/24 13:53:15 INFO BlockManagerMasterActor$BlockManagerInfo: Added rdd_1_1 in memory on 192.168.171.1:34930 (size: 544.0 B, free: 640.0 MB)

14/06/24 13:53:15 INFO BlockManagerMaster: Updated info of block rdd_1_1

14/06/24 13:53:15 INFO Executor: Serialized size of result for 1 is 563

14/06/24 13:53:15 INFO Executor: Sending result for 1 directly to driver

14/06/24 13:53:15 INFO Executor: Finished task ID 1

14/06/24 13:53:15 INFO DAGScheduler: Completed ResultTask(0, 1)

14/06/24 13:53:15 INFO TaskSetManager: Finished TID 1 in 24 ms on localhost (progress: 2/2)

14/06/24 13:53:15 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool

14/06/24 13:53:15 INFO DAGScheduler: Stage 0 (count at SimpleApp.scala:11) finished in 0.292 s

14/06/24 13:53:15 INFO SparkContext: Job finished: count at SimpleApp.scala:11, took 0.458186941 s

14/06/24 13:53:15 INFO SparkContext: Starting job: count at SimpleApp.scala:12

14/06/24 13:53:15 INFO DAGScheduler: Got job 1 (count at SimpleApp.scala:12) with 2 output partitions (allowLocal=false)

14/06/24 13:53:15 INFO DAGScheduler: Final stage: Stage 1 (count at SimpleApp.scala:12)

14/06/24 13:53:15 INFO DAGScheduler: Parents of final stage: List()

14/06/24 13:53:15 INFO DAGScheduler: Missing parents: List()

14/06/24 13:53:15 INFO DAGScheduler: Submitting Stage 1 (FilteredRDD[3] at filter at SimpleApp.scala:12), which has no missing parents

14/06/24 13:53:15 INFO DAGScheduler: Submitting 2 missing tasks from Stage 1 (FilteredRDD[3] at filter at SimpleApp.scala:12)

14/06/24 13:53:15 INFO TaskSchedulerImpl: Adding task set 1.0 with 2 tasks

14/06/24 13:53:15 INFO TaskSetManager: Starting task 1.0:0 as TID 2 on executor localhost: localhost (PROCESS_LOCAL)

14/06/24 13:53:15 INFO TaskSetManager: Serialized task 1.0:0 as 1693 bytes in 0 ms

14/06/24 13:53:15 INFO Executor: Running task ID 2

14/06/24 13:53:15 INFO BlockManager: Found block broadcast_0 locally

14/06/24 13:53:15 INFO BlockManager: Found block rdd_1_0 locally

14/06/24 13:53:15 INFO Executor: Serialized size of result for 2 is 563

14/06/24 13:53:15 INFO Executor: Sending result for 2 directly to driver

14/06/24 13:53:15 INFO Executor: Finished task ID 2

14/06/24 13:53:15 INFO TaskSetManager: Starting task 1.0:1 as TID 3 on executor localhost: localhost (PROCESS_LOCAL)

14/06/24 13:53:15 INFO TaskSetManager: Serialized task 1.0:1 as 1693 bytes in 0 ms

14/06/24 13:53:15 INFO Executor: Running task ID 3

14/06/24 13:53:15 INFO DAGScheduler: Completed ResultTask(1, 0)

14/06/24 13:53:15 INFO TaskSetManager: Finished TID 2 in 9 ms on localhost (progress: 1/2)

14/06/24 13:53:15 INFO BlockManager: Found block broadcast_0 locally

14/06/24 13:53:15 INFO BlockManager: Found block rdd_1_1 locally

14/06/24 13:53:15 INFO Executor: Serialized size of result for 3 is 563

14/06/24 13:53:15 INFO Executor: Sending result for 3 directly to driver

14/06/24 13:53:15 INFO Executor: Finished task ID 3

14/06/24 13:53:15 INFO DAGScheduler: Completed ResultTask(1, 1)

14/06/24 13:53:15 INFO DAGScheduler: Stage 1 (count at SimpleApp.scala:12) finished in 0.017 s

14/06/24 13:53:15 INFO TaskSetManager: Finished TID 3 in 9 ms on localhost (progress: 2/2)

14/06/24 13:53:15 INFO SparkContext: Job finished: count at SimpleApp.scala:12, took 0.028886185 s

14/06/24 13:53:15 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool

Lines with a: 8, Lines with b: 5

14/06/24 13:53:15 INFO ConnectionManager: Selector thread was interrupted!

[success] Total time: 5 s, completed Jun 24, 2014 1:53:15 PM

[dc@localhost spark]$

The input file is:

[dc@localhost spark]$ cat README.md

thsi si a test

this is a test for lines with a

a

a

a

a

and for the letter b

b

b

b

another b with a

Notes: Bigtop doesn't install sbt; you will have to install sbt under /usr/lib/spark to match the Spark distribution layout. There is a build.properties error message:

[dc@localhost spark]$ sbt/sbt run

awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for reading (No such file or directory)

Launching sbt from sbt/sbt-launch-.jar

Error: Invalid or corrupt jarfile sbt/sbt-launch-.jar

Create a project directory and copy build.properties from the source distribution, or create your own. Set sbt.version inside build.properties to the version of sbt you are using; sbt fetches that version before starting. It will probably still work if you skip this step.

[dc@localhost spark]$ cat project/build.properties

#

# Licensed to the Apache Software Foundation (ASF) under one or more

# contributor license agreements. See the NOTICE file distributed with

# this work for additional information regarding copyright ownership.

# The ASF licenses this file to You under the Apache License, Version 2.0

# (the "License"); you may not use this file except in compliance with

# the License. You may obtain a copy of the License at

#

#

#

# Unless required by applicable law or agreed to in writing, software

# distributed under the License is distributed on an "AS IS" BASIS,

# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.

# See the License for the specific language governing permissions and

# limitations under the License.

#

sbt.version=0.13.5

Debug Step #2b: modify the above to support streaming.

Modify the sbt file to include the streaming and Twitter4J dependencies:

[dc@localhost spark]$ cat simple.sbt

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.3.0"

libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "0.9.1"

libraryDependencies += "org.twitter4j" % "twitter4j-stream" % "3.0.3"

Modify SimpleApp.scala to add streaming and print out the tweets. Copy the TwitterUtils.scala and TwitterInputDStream.scala files from the 0.9.1 source code, or import the external Twitter jar (see the sbt line below).
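If you go the external-jar route instead of copying the source files, the Spark 0.9.1 external Twitter module should provide TwitterUtils; a plausible simple.sbt addition (untested here, and it means importing org.apache.spark.streaming.twitter._ in SimpleApp.scala) is:

libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.1"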

My source dir looks like:

[dc@localhost spark]$ ls src/main/scala/

SimpleApp.scala TwitterInputDStream.scala TwitterUtils.scala

[dc@localhost spark]$

[dc@localhost spark]$ cat src/main/scala/SimpleApp.scala

/*** SimpleApp.scala ***/
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.streaming._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/usr/lib/spark/README.md" // Should be some file on your system
    //val sc = new SparkContext("local", "Simple App", "/usr/lib/spark",
    //  List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    //val logData = sc.textFile(logFile, 2).cache()
    //val numAs = logData.filter(line => line.contains("a")).count()
    //val numBs = logData.filter(line => line.contains("b")).count()
    //println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))

    // OAuth credentials for the Twitter4J receiver (use your own keys).
    System.setProperty("twitter4j.oauth.consumerKey", "eFIaiOuxsny01VVQ2QWISK1Mw")
    System.setProperty("twitter4j.oauth.consumerSecret", "gDQI5EiCMJJaaNI8XVNhfZXwuCOYfeJ3XsOUNHvsXqgq0Hoj9T")
    System.setProperty("twitter4j.oauth.accessToken", "76976448-Otz8w4yMKx6yCEWTH3dNTfuF8LYeLgqdoDrcl0oBK")
    System.setProperty("twitter4j.oauth.accessTokenSecret", "NFPFe2EzuKWuzRKmY1RENUBfQzGeGbAS1JzjX3Eu3GwDE")

    // scaladocs not accurate, follow Holden's example
    // spark://192.168.171.1:7077
    val stream = new StreamingContext("local", "Simple App", Seconds(1))
    // TwitterUtils comes from the TwitterUtils.scala copied into src/main/scala
    val tweets = TwitterUtils.createStream(stream, None)
    tweets.print()
    stream.start()
    println("+++++++++++++++++++++++++++++")
    stream.awaitTermination()
    //sc.stop()
  }
}

Output:

こんな楽しいゲームはこれが初めて!!! source='<a href=" rel="nofollow">ネットワーク50</a>', isTruncated=false, inReplyToStatusId=-1, inReplyToUserId=-1, isFavorited=false, inReplyToScreenName='null', geoLocation=null, place=null, retweetCount=97, isPossiblySensitive=false, contributorsIDs=[J@ef3bee1, retweetedStatus=null, userMentionEntities=[], urlEntities=[URLEntityJSONImpl{url=' expandedURL=' displayURL='urx.nu/9CfR'}], hashtagEntities=[], mediaEntities=[MediaEntityJSONImpl{id=481602776594513920, url= mediaURL= mediaURLHttps= expandedURL= displayURL='pic.twitter.com/JqhnaWj8Gj', sizes={0=Size{width=150, height=150, resize=101}, 1=Size{width=340, height=408, resize=100}, 2=Size{width=500, height=600, resize=100}, 3=Size{width=500, height=600, resize=100}}, type=photo}], currentUserRetweetId=-1, user=UserJSONImpl{id=2515434492, name='【公式】大人気ゲームアプリ速報 ', screenName='ninkiap', location='', description='オススメの無料アプリや知って得するアプリ情報をアップしていきます。

知ってる人はRTしてみんなに教えてあげよ♪', isContributorsEnabled=false, profileImageUrl=' profileImageUrlHttps=' url='null', isProtected=false, followersCount=103, status=null, profileBackgroundColor='C0DEED', profileTextColor='333333', profileLinkColor='0084B4', profileSidebarFillColor='DDEEF6', profileSidebarBorderColor='C0DEED', profileUseBackgroundImage=true, showAllInlineMedia=false, friendsCount=167, createdAt=Thu May 22 06:29:24 PDT 2014, favouritesCount=0, utcOffset=32400, timeZone='Irkutsk', profileBackgroundImageUrl=' profileBackgroundImageUrlHttps=' profileBackgroundTiled=false, lang='ja', statusesCount=1824, isGeoEnabled=false, isVerified=false, translator=false, listedCount=1, isFollowRequestSent=false}}, userMentionEntities=[UserMentionEntityJSONImpl{name='【公式】大人気ゲームアプリ速報 ', screenName='ninkiap', id=2515434492}], urlEntities=[URLEntityJSONImpl{url=' expandedURL=' displayURL='urx.nu/9CfR'}], hashtagEntities=[], mediaEntities=[MediaEntityJSONImpl{id=481602776594513920, url= mediaURL= mediaURLHttps= expandedURL= displayURL='pic.twitter.com/JqhnaWj8Gj', sizes={0=Size{width=150, height=150, resize=101}, 1=Size{width=340, height=408, resize=100}, 2=Size{width=500, height=600, resize=100}, 3=Size{width=500, height=600, resize=100}}, type=photo}], currentUserRetweetId=-1, user=UserJSONImpl{id=2515560282, name='花言葉bot', screenName='hanakotobaw', location='', description='花言葉をつぶやきます。 誰かにお花を贈る時などに参考になれば嬉しいです。 知らなかったらRTお願いします。', isContributorsEnabled=false, profileImageUrl=' profileImageUrlHttps=' url='null', isProtected=false, followersCount=52, status=null, profileBackgroundColor='C0DEED', profileTextColor='333333', profileLinkColor='0084B4', profileSidebarFillColor='DDEEF6', profileSidebarBorderColor='C0DEED', profileUseBackgroundImage=true, showAllInlineMedia=false, friendsCount=446, createdAt=Thu May 22 07:31:15 PDT 2014, favouritesCount=0, utcOffset=32400, timeZone='Irkutsk', profileBackgroundImageUrl=' profileBackgroundImageUrlHttps=' profileBackgroundTiled=false, lang='ja', statusesCount=223, isGeoEnabled=false, isVerified=false, translator=false, listedCount=1, isFollowRequestSent=false}}

StatusJSONImpl{createdAt=Tue Jun 24 18:00:45 PDT 2014, id=481602865991929857, text='規制垢【@ syara04282】からの無言フォローお許しください <BoT>', source='<a href=" rel="nofollow">autotweety.net</a>', isTruncated=false, inReplyToStatusId=-1, inReplyToUserId=-1, isFavorited=false, inReplyToScreenName='null', geoLocation=null, place=null, retweetCount=0, isPossiblySensitive=false, contributorsIDs=[J@2478ad72, retweetedStatus=null, userMentionEntities=[], urlEntities=[], hashtagEntities=[], mediaEntities=[], currentUserRetweetId=-1, user=UserJSONImpl{id=1249201381, name='紗螺@しゃららん', screenName='syara0428', location='室内', description='名前はしゃらです。 fate/弾丸論破/FF/マギ/銀魂…etcが好き。なりきりさんや関係の一般さんと絡む用。

アイコン頂き物ヘッダートレスにつき持ち帰りはご遠慮ください', isContributorsEnabled=false, profileImageUrl=' profileImageUrlHttps=' url=' isProtected=false, followersCount=281, status=null, profileBackgroundColor='0099B9', profileTextColor='3C3940', profileLinkColor='0099B9', profileSidebarFillColor='95E8EC', profileSidebarBorderColor='5ED4DC', profileUseBackgroundImage=true, showAllInlineMedia=false, friendsCount=288, createdAt=Thu Mar 07 06:32:19 PST 2013, favouritesCount=12857, utcOffset=32400, timeZone='Irkutsk', profileBackgroundImageUrl=' profileBackgroundImageUrlHttps=' profileBackgroundTiled=false, lang='ja', statusesCount=79682, isGeoEnabled=false, isVerified=false, translator=false, listedCount=9, isFollowRequestSent=false}}

StatusJSONImpl{createdAt=Tue Jun 24 18:00:45 PDT 2014, id=481602865987727361, text='RT @AnfasA2014: هي أربع ؟ , خيرٌ لك وأبقى سبحان الله ، و الحمد لله ولا إله إلا الله ، و الله أكبر~', source='<a href=" rel="nofollow">Twitter for iPhone</a>', isTruncated=false, inReplyToStatusId=-1, inReplyToUserId=-1, isFavorited=false, inReplyToScreenName='null', geoLocation=null, place=null, retweetCount=0, isPossiblySensitive=false, contributorsIDs=[J@6ec32a12, retweetedStatus=StatusJSONImpl{createdAt=Tue Jun 24 17:18:29 PDT 2014, id=481592229014691841, text='هي أربع ؟ , خيرٌ لك وأبقى سبحان الله ، و الحمد لله ولا إله إلا الله ، و الله أكبر~', source='<a href=" rel="nofollow">Twitter for BlackBerry®</a>', isTruncated=false, inReplyToStatusId=-1, inReplyToUserId=-1, isFavorited=false, inReplyToScreenName='null', geoLocation=null, place=null, retweetCount=3, isPossiblySensitive=false, contributorsIDs=[J@4eff3c8d, retweetedStatus=null, userMentionEntities=[], urlEntities=[], hashtagEntities=[], mediaEntities=[], currentUserRetweetId=-1, user=UserJSONImpl{id=2291801757, name='انفاسـ الجنوبـ', screenName='AnfasA2014', location=' ~ تحتـ رحمة الله~', description='يارب اذا حضرتني الوفاة فسخر لي من يلقني الشهادتين واطلق بها لساني واختم لي خاتمةحسنة وسهل علي سكرات الموت وادخلني مع عبادك الصالحينـ~', isContributorsEnabled=false, profileImageUrl=' profileImageUrlHttps=' url='null', isProtected=false, followersCount=4127, status=null, profileBackgroundColor='C0DEED', profileTextColor='333333', profileLinkColor='0084B4', profileSidebarFillColor='DDEEF6', profileSidebarBorderColor='C0DEED', profileUseBackgroundImage=true, showAllInlineMedia=false, friendsCount=4478, createdAt=Sun Jan 19 06:55:15 PST 2014, favouritesCount=537, utcOffset=-1, timeZone='null', profileBackgroundImageUrl=' profileBackgroundImageUrlHttps=' profileBackgroundTiled=false, lang='ar', statusesCount=9958, isGeoEnabled=true, isVerified=false, translator=false, listedCount=4, isFollowRequestSent=false}}, userMentionEntities=[UserMentionEntityJSONImpl{name='انفاسـ الجنوبـ', screenName='AnfasA2014', id=2291801757}], urlEntities=[], hashtagEntities=[], mediaEntities=[], currentUserRetweetId=-1, user=UserJSONImpl{id=2560181521, name='فرووحـہَ', screenName='Froo7a_1981', location='', description='لَا إِلَهَ إِلَّا أَنْتَ سُبْحَانَكَ إِنِّي كُنْتُ مِنَ الظَّالِمِينَ', isContributorsEnabled=false, profileImageUrl=' profileImageUrlHttps=' url='null', isProtected=false, followersCount=1200, status=null, profileBackgroundColor='C0DEED', profileTextColor='333333', profileLinkColor='0084B4', profileSidebarFillColor='DDEEF6', profileSidebarBorderColor='C0DEED', profileUseBackgroundImage=true, showAllInlineMedia=false, friendsCount=1843, createdAt=Tue Jun 10 17:11:42 PDT 2014, favouritesCount=48, utcOffset=-1, timeZone='null', profileBackgroundImageUrl=' profileBackgroundImageUrlHttps=' profileBackgroundTiled=false, lang='ar', statusesCount=241, isGeoEnabled=false, isVerified=false, translator=false, listedCount=0, isFollowRequestSent=false}}

StatusJSONImpl{createdAt=Tue Jun 24 18:00:45 PDT 2014, id=481602866012905473, text='Can I move out of this state and disconnect my phone ?', source='<a href=" rel="nofollow">Twitter for iPhone</a>', isTruncated=false, inReplyToStatusId=-1, inReplyToUserId=-1, isFavorited=false, inReplyToScreenName='null', geoLocation=null, place=null, retweetCount=0, isPossiblySensitive=false, contributorsIDs=[J@56e0757f, retweetedStatus=null, userMentionEntities=[], urlEntities=[], hashtagEntities=[], mediaEntities=[], currentUserRetweetId=-1, user=UserJSONImpl{id=1942450526, name='Cheyenne De Jarnett', screenName='cheydejarnett', location='', description='Happy. Living life to the fullest. Fresno State Bound', isContributorsEnabled=false, profileImageUrl=' profileImageUrlHttps=' url='null', isProtected=false, followersCount=156, status=null, profileBackgroundColor='C0DEED', profileTextColor='333333', profileLinkColor='0084B4', profileSidebarFillColor='DDEEF6', profileSidebarBorderColor='C0DEED', profileUseBackgroundImage=true, showAllInlineMedia=false, friendsCount=164, createdAt=Sun Oct 06 17:38:42 PDT 2013, favouritesCount=1097, utcOffset=-1, timeZone='null', profileBackgroundImageUrl=' profileBackgroundImageUrlHttps=' profileBackgroundTiled=false, lang='en', statusesCount=1553, isGeoEnabled=false, isVerified=false, translator=false, listedCount=0, isFollowRequestSent=false}}
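That raw Status dump confirms the stream is flowing end to end. From here the tweets DStream can be transformed like any other; as a rough, untested sketch (it would go after the tweets val and before stream.start(), and reduceByKeyAndWindow needs the pair-DStream implicits), a windowed hashtag count looks something like:

import org.apache.spark.streaming.StreamingContext._  // pair-DStream implicits

// keep only hashtag tokens from each tweet's text
val hashTags = tweets.flatMap(status => status.getText().split(" ")).filter(_.startsWith("#"))

// count each hashtag over a sliding 60-second window and print the counts per batch
val tagCounts = hashTags.map(tag => (tag, 1)).reduceByKeyAndWindow(_ + _, Seconds(60))

tagCounts.print()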