Fred Wilson:

What you might miss, and I missed until recently, is that Tweetstorming has some unique characteristics, which I outlined in my storm, that make it different and possibly better in some respects.

What you also might have missed: tweet storms are annoying as hell, in that they clog up the receiver's timeline with a bunch of stuff they may or may not care about, and as a bonus are difficult to read in sequence / context.

Get a tumblr. Post an essay. Post a link to the essay on Twitter. I hate that "let's monopolize all my followers' timelines" is gaining legitimacy by giving it a name.

Creating a Bucket Policy:

You use the AWS Policy Generator to generate a Bucket Policy. There are several examples online, and Amazon itself provides a ton.

Allow viewing & downloading of S3 objects directly via a browser. This is useful if, for example, you send attachments via email and also want to link to them on S3.
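For instance, a minimal public-read policy looks something like this (the bucket name `example-bucket` is a placeholder; point the Resource ARN at your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```

Paste the generated JSON into the bucket's permissions settings and every object in the bucket becomes readable via a plain HTTPS URL.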

AWS is so full of its own jargon at this point - set an S3 bucket policy and ACL, but make sure you have the right IAM key, and also that CORS doesn't conflict. And that's just for S3. Don't even look at Route 53.

Everything you know about Unix server administration has been reinvented on AWS with slightly different names and parameters, so none of the knowledge is transferable outside of Amazon's infrastructure. Heavy lock-in. I don't like it.

Time comparison with ActiveSupport failed

now = Time.zone.now
 => Wed, 19 Feb 2014 21:30:56 UTC +00:00
Time.zone.at(now.to_i)
 => Wed, 19 Feb 2014 21:30:56 UTC +00:00
now == Time.zone.at(now.to_i)
 => false

How is it possible?

Upd:

Time.zone.at(now.to_i).to_i == now.to_i
 => true

Selected Answer (from jvperrin)

Ruby tracks time down to the nanosecond:

now = Time.zone.now
 => Wed, 19 Feb 2014 21:30:56 UTC +00:00
Time.zone.at(now.to_f)
 => Wed, 19 Feb 2014 21:30:56 UTC +00:00
now == Time.zone.at(now.to_f)
 => false

But if you compare the nanoseconds, you will see they are not the same, even when creating the time object from the float value, because the float used to create the new time object is less precise than the nanosecond value of the original time:

now.nsec
 => 956134961
Time.zone.at(now.to_f).nsec
 => 956134796

Got hit with this while testing.
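The usual workaround in tests is to compare at whole-second precision instead of full equality. A minimal sketch with plain `Time` (the fixed timestamp here is just for illustration; `Time.zone` objects behave the same way):

```ruby
# Fixed timestamp with a sub-second component, so the example is deterministic.
t1 = Time.at(1392845456, 956134.961) # second argument is microseconds
t2 = Time.at(t1.to_f)                # float round-trip loses precision

t1 == t2           # false: the sub-second parts differ slightly
t1.to_i == t2.to_i # true: same whole second
```

Comparing `to_i` values (or asserting the difference is within a small tolerance) avoids the nanosecond mismatch entirely.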

From the docs:

params = ActionController::Parameters.new({
  person: {
    name: 'Francesco',
    age:  22,
    role: 'admin'
  }
})

permitted = params.require(:person).permit(:name, :age)
permitted            # => {"name"=>"Francesco", "age"=>22}
permitted.class      # => ActionController::Parameters
permitted.permitted? # => true

Chainable method calls are wrong here. They're okay for the most basic possible forms, using all the Rails defaults, but for anything else it's crap. See bullshit like this - and that's for sanctioned nested_attributes-type calls.

Let's take a simple example:

params = ActionController::Parameters.new(first: true, second: {first_hash: 1, second_hash: 2, third_hash: 3})

How do we get the values of both :first and :second ? If you know in advance what all the possible keys for :second are, you can do this:

params.permit(:first, :second => [:first_hash, :second_hash, :third_hash])

If you don't know exactly which keys are there, or want to use logic based on what keys are present... well, no. Definitely not.
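One common (hacky) workaround is to read the incoming keys first and hand them to permit; in a controller that looks like `params.permit(:first, second: params[:second].keys)` (assuming `:second` is present and you trust whatever arrived in it). The key-listing step itself is plain Ruby:

```ruby
# Stand-in hash for the incoming parameters (hypothetical values).
incoming = { first: true, second: { first_hash: 1, second_hash: 2, third_hash: 3 } }

# Collect whatever keys actually arrived, then feed them to permit.
second_keys = incoming[:second].keys
# => [:first_hash, :second_hash, :third_hash]
```

Of course, permitting whatever keys showed up defeats the point of whitelisting, which is exactly the complaint.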

If you're doing something like tags, lists, or other things that don't nicely map to one parameter => single-level hash, you're in for a rough ride. If you only want to allow certain attributes to be edited by certain users, you need a hacky workaround like building up a giant array or hash before calling .permit . If you want to call params.permit in a before_filter, get lost. If you want to save Javascript logs or something else with a totally arbitrary structure, go die in a fire.

A better model would be to pass a schema to StrongParameters. Imagine if we could call something like this:

def user_params
  params.schema(
    user: {
      email: String,
      tags: [Array, String],
      happiness_level: Numeric,
      preferences: {
        remember_me: Boolean,
        email_me: Boolean
      },
      js_analytics: JSON,
      js_events: Array,
      js_logs: Hash
    }
  )
end

This would be way nicer - you could specify conversions so you don't get strings where you expect numbers or booleans. You could specify what sub-attributes to allow on a hash or array, or just use the raw class if you want to sort it out yourself. And, this theoretical schema method would just return a new Parameters instance with the schema applied - if you wanted to get different attributes earlier or later in the request, just call params.schema again.
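As a proof of concept, the filtering half of such a schema method can be sketched in a few lines of plain Ruby (a hypothetical helper, not part of Rails; type coercion is left out):

```ruby
# Keep a key only if its value matches the declared class; recurse into
# nested hash schemas. Unknown keys are dropped, like permit does.
def apply_schema(raw, schema)
  schema.each_with_object({}) do |(key, spec), out|
    value = raw[key]
    next if value.nil?
    case spec
    when Hash  then out[key] = apply_schema(value, spec) if value.is_a?(Hash)
    when Class then out[key] = value if value.is_a?(spec)
    end
  end
end

raw = { email: "a@b.com", happiness_level: 7, junk: "drop me",
        preferences: { remember_me: true, theme: "dark" } }
apply_schema(raw, { email: String, happiness_level: Numeric,
                    preferences: { remember_me: Object } })
# => {:email=>"a@b.com", :happiness_level=>7, :preferences=>{:remember_me=>true}}
```

A real implementation would also coerce "true" to a boolean and "7" to an integer, but even this much already handles the arbitrary-structure cases that permit chokes on.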

I've been developing with Rails for ten years now. If I find this crap difficult, I can't imagine what people in coding boot camps must think of it.

Update: the rails_param gem looks like a good alternative.

Getting real sick of your shit, startups. These marketing emails are not cute. They are (in order) condescending, creepy, and insulting.

Boatbound

No, I didn't forget, but thanks for making me think for a second that I had.

Bitbucket

Is my repo gonna commit suicide unless I stay with it? Does it need me to love it forever?

Homejoy

Forwarded from a friend, but I got this one too. No, Homejoy. Actually, I can take care of myself without you. You're convenient. When you're not insulting me.

On BitTorrent's upcoming chat app:

To start, users will be able to choose how they use our chat app. If you are porting in contact lists, you have the convenience of signing up with email or with a phone number. You will also have the option to sign up in Incognito mode, using no such information at all.

What we are building for the Alpha will also address users communicating with a trusted source who prefer their communication to be device-to-device (decentralized). This means no hops through any 3rd party servers, and no chance of anything being intercepted.

For users who may prefer to have their metadata obscured, messages will be indirect and routed through a third node. It is all a matter of preference.

Showing users directly how their message is being routed is genius. Using relay servers to obscure metadata is cool, but how would users know which relay servers to trust? Or are they supposed to set up their own?

I'm also curious if mesh networking will play a role here. It really sets up some cool cyberpunk scenarios, where a journalist and a source can "meet" by being on the same block or in the same highrise, without knowing what the other person looks like, their real identity, or exactly where they are, and can chat without the NSA et al. being able to pick up the traffic.

The design integrity of your system is far more important than being able to test it at any particular layer. Stop obsessing about unit tests, embrace backfilling of tests when you're happy with the design, and strive for overall system clarity as your principal pursuit.

I think it's hilarious that TDD has gone too far for DHH. And I like his thoughts around the matter - test what's important, at the important levels. Don't add dozens of gems and layers of indirection and library magic just for testing.

This series means to teach you everything you need to know to implement any different caching level inside your Rails application. It assumes you know nothing at all about caching in any of its forms. It takes you from zero knowledge to an intermediate level in all areas. If you can't implement caching in your app after reading this then I've failed.

Lovely reference. Thanks, Mr Hawkins!

So, how hard could it be to build my own music player backend? Seems like it would be a matter of solving these things:

  • Use a robust library for audio decoding. How about the same one that VLC uses?
  • Support adding and removing entries on a playlist for gapless playback.
  • Support pause, play, and seek.
  • Per-playlist-item gain adjustment so that perfect loudness compensation can be implemented.
  • Support loudness scanning to make it easy to implement for example ReplayGain.
  • Support playback to a sound device chosen at runtime.
  • Support transcoding audio into another format so a player can implement, for example, HTTP streaming.
  • Give raw access to decoded audio buffers just in case a player wants to do something other than one of the built-in things.
  • Try to get other projects to use it to benefit from code reuse.
  • Make the API generic enough to support other music players and other use cases.
  • Get it packaged into Debian and Ubuntu.
  • Make a blog post about it to increase awareness.

Playing music on a computer is almost as hard as reading text files. At this point, I'd be pretty happy with a player that:

  1. plays all tracks in a folder (since no one can get artist / album / compilation right)
  2. displays things in a list view (album art sucks if you download lots of independent or unpublished music)
  3. syncs locally on my devices - Mac, Windows, and iOS. Dropbox would be a great option here.
  4. plays my music through a browser

The loudness compensation and gain equalizing is not a big deal to me - I prefer to listen to DJ mixes and entire albums, so the next track is rarely going to be mixed differently than the previous, and it's too easy to introduce playback bugs there.

iTunes fails at all of these - syncing with iTunes Match blows, you have to make playlists for every folder, and it loooooves album art (even when most of it is missing). Also, it crashes constantly on Windows.

Google Play Music (wtf branding) fails at 1 and 2, and I didn't get far enough to experiment with 3. Seems good enough at 4.

Rdio doesn't let you play your own music, and they sure as hell aren't gonna have the new (free!) album from Illectrix or Savoy's new album Self Predator.

Spotify has terrible local vs cloud syncing, and pretty bad playlist management. Basically, if you're not getting all your music from their landfill of pop hits, get lost.

Amazon Music looks like it hits 2, 3 and 4. I have enough playlists at this point that maybe 1 won't be a problem.

Some random, disorganized thoughts after finishing "Good to Great" by Jim Collins. Here's a great slideshare with the Cliff Notes, though they're not going to mean much unless you've read through the examples.

This book seems to avoid just being a collection of survivor bias stories. Every company they researched had a comparison company from the same industry at the same time that didn't become great. They also had a collection of companies that turned great for a little while, but couldn't keep it up. The book is mostly about what differentiates the long-term great companies from the merely-good-to-terrible ones.

Leadership: it's not about strategy or style. Great leaders tend to be quiet, humble, self-effacing, and yet doggedly persistent and nigh-unstoppable once they've made a decision. They take their time, deliberate, have heated discussions with their trusted advisors & friends, then pursue a path like the Terminator.

The right people. You have to start with the right people. The right people agree with your philosophy, are excited about what they're doing, and have it in their character to do the kind of work they do. Skills can be learned; this is about character traits. Once you have the right people, you have to keep acquiring more of the right people. As soon as you hire the wrong people, you are playing a losing game. No compromises here.

You should never have to motivate your people. They should be intrinsically motivated. Your job is to prevent them from getting de-motivated.

Don't fire someone immediately if they don't seem right - they might just not be in the right position.

The Stockdale paradox: confront the brutal facts. Get unfiltered data. See if what you are doing is working. Maintain unwavering faith that your story ends with triumph; that you retire with your company on top of the world. But at the same time, do not translate this into short-term visions of big growth. If you say "we will be this much better by Christmas" you are setting yourself up for failure.

Find your hedgehog concept. This is something your company does that

  1. Really taps into the passions of your people
  2. Drives your economics, however you measure that (revenue per customer, per visit, per sale, etc)
  3. Is something you could plausibly be the best in the world at

This is the core of your company - make sure you hammer & refine this constantly, and say no to anything that is outside this concept. It could be a product or business: for Walgreens it was "the most convenient drugstores." Or it could be a process: for GE it was "building & training the best executive talent."

The flywheel - momentum is a real thing for groups of people. It's very slow to get going, but once it does, it builds and builds. It helps you attract the right people, build resources, and chase the right initiatives. If you change directions constantly, you are not turning the flywheel - you are not building momentum. If you think some new initiative or acquisition will motivate people, quickly produce big results, and get that momentum started -- you're wrong. It won't. Momentum builds over time, exclusively.

Your company needs to have some reason for existing beyond making money. It needs to have core values that it holds to; oddly, it doesn't matter what those values are. Philip Morris has core values and holds to them strongly, even though those values include poisoning millions of people.

Overall summary: This book is great. It contains a lot of lessons we've heard before, but it puts them all together into a coherent structure, and brings in real qualitative data to back them up. It's reinforced a lot of the beliefs I had about running a good company, and I'm hoping that's not just selection bias.