This article is a living document; I’ll update it with new commands and change or fix the existing ones as needed.
Start CouchDB in a Docker container
Start CouchDB version 2.3.1 in a Docker container with port 5984 open so it’s available on the host on the same port.
sudo docker run -p 5984:5984 -d couchdb:2.3.1
Start CouchDB version 2.3.1 in a Docker container with all relevant ports open, a node name of couchdb-0-project, and a volume named volume-0-couchdb mounted to /opt/couchdb/data so CouchDB can use it for permanent storage.
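A command matching that description might look like the following sketch. The port list follows CouchDB 2.x’s defaults (5984 for HTTP, 4369 for epmd, 9100–9200 for clustering), and NODENAME is how the official image sets the node name; double-check these against your own setup:

```shell
sudo docker run -d \
  --name couchdb0 \
  -p 5984:5984 -p 4369:4369 -p 9100-9200:9100-9200 \
  -e NODENAME=couchdb-0-project \
  -v volume-0-couchdb:/opt/couchdb/data \
  couchdb:2.3.1
```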
Restart a container that was started with --name=couchdb0 on subsequent runs with
sudo docker restart couchdb0
Connect to a remote CouchDB that’s only listening on localhost
If your server is example.com, your username on that server is username, and your ssh daemon is listening on a non-standard port 1000 (the standard is 22), running this command will log you into that ssh server and create a tunnel from the server’s port 5984, where CouchDB normally listens, to your local port 22222.
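With the pieces above filled in, the command looks like this:

```shell
ssh -p 1000 -L 22222:localhost:5984 username@example.com
```

CouchDB is then reachable on your machine at http://localhost:22222 for as long as the ssh session stays open.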
If you stop and think about what happens when all the plants are gone, it can become obvious. If it’s not, maybe the results of the simulation can shine some light.
With their energy source running out, beings closer to the top of the food chain can still survive and make progress for a while by feeding on those lower in the food chain. Oblivious to their impending fate, however, sooner or later they go extinct too.
Z. Fras, “Artificial life simulation,” M. S. thesis, University of Zagreb, Zagreb, Croatia, 2014.
Is it weird to quote your own thesis? I don’t know, is it? Perhaps.
Let’s pretend that’s fine for a bit, because the data is real, the simulation is real, the title is obvious and the world is, not just metaphorically, on fire, so you’re not here for the answer to the question, you’re here to go deeper. You’re here because you hope there’s more. You’re still reading because you hope we can do something about it and you want to know what your part is. Or that’s my mental image of the kind of person you are.
If it were all black and white, we would end here. Quite literally. We’re killing our forests, our plants are dying, and simulations are predicting our demise. We ought to write our eulogies as we look at the smoke rising on the horizon while the setting sun paints beautiful, bloody hues. Our only hope for life would be the Phoenix of the next civilization rising from these ashes, thinking to themselves “What the heck happened to these guys, they had such advanced technology…”. Because life finds a way, and if this were a movie, that leaves room for a sequel.
Luckily, the world has shades of gray, and like the bloody sunset, it has color, in both the visible and invisible spectrum. Simulations are not copies of the real world, and this is our first movie, not the sequel.
Let’s dive in
This is not the only outcome observed in the simulation. Two other, at first glance equally gloomy, outcomes appeared frequently.
Beings at the top of the food chain exhausted their life-sustaining resources and essentially eradicated the world until there was nothing left but plants. This outcome is not that interesting to us humans. The solution sounds simple: we all just (if only by lack of choice) go vegan. Until this scenario turns into the one this article is about. So, in your mind, just prepare to go vegan and bundle the two together.
Due to sheer luck of the draw and fortunate initial placements, life arranged itself in a favorable position in the energy flow that led to overpopulation. So overpopulation is fortunate? Compared to extinction, certainly. Our universe is huge; there is still plenty of space to populate. Beings in the simulation bent the rules of the physics engine in rare fluke cases and escaped the confines of their safe little bubble. Just like them, humanity too is exploring the vicinity of our little bubble. Outside of it, our universe is a very hostile place.
If we bundle in the first alternate outcome (or convince ourselves it’s fine to ignore it), we notice how obvious the second one is. We’re making accelerated progress with SpaceX, Blue Origin, NASA’s latest plans, and plenty of other, lesser-known companies. The only thing left is to address the main one.
The main topic of this article, what happens when all plants die, is also obvious. Yet, while obvious, it carries inherent dangers.
Have you ever thought about what would happen if there were no more plants? A chain reaction!
While the answer to the question “What follows the extinction of plant life?” is obvious and the dangers seem obvious too, what’s not obvious is that it’s a chain. A food chain.
And change in chains travels as a wave. Those who spend time with me on a daily basis (hello teammates) now smiled a bit, sensing a physics lesson coming up. I’ll keep it light, promise. When something happens on one end of the chain, the effects influence their neighboring link, which influences their neighboring link, which influences their neighboring link… Effects are slowly making their way throughout the whole chain and finally reach the other end of the chain. The longer the chain, the longer it takes for changes to propagate.
But Zvonimir, that’s obvious, everybody can see it coming!
Yes, it’s easy when you’re looking from the side. It’s a lot harder when you’re one of the links in that chain. What makes it even harder is visible in those graphs at the beginning. The chain effect allows those removed from the source of the disturbance to continue unaffected, and even prosper, oblivious that the ground is disappearing under their feet, until the change arrives and takes them with it. That’s what happened to the beings at the top of the food chain in the simulation. And it might already be happening to us. All the deforestation and fires certainly suggest so. And we wouldn’t feel the effects right away either.
If we look at the correlation between the number of species extinctions and human population growth, we can pretend that correlation implies causation and that, because other species go extinct, we can prosper.
Let’s laugh a little and move on from that idea, because correlation doesn’t imply causation. Instead, have a look at the graphs from the beginning again. Notice how plant life numbers (green graph) start to drop while second-tier life (blue graph) and top-tier life (red graph) prosper. It takes plants going below a certain threshold for the second-tier life to start feeling the effects. It’s the point of no return for these simulated populations. Top-tier life continues to prosper for a while until a lower tier hits a similar threshold. This cascade continues as the effects spread through the chain. The longer the chain, the longer it takes for the top tier to become affected.
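To make the chain-reaction mechanics concrete, here’s a toy model (made-up numbers and rules, not the thesis simulation): plants decline steadily, and each higher tier keeps prospering until its food source drops below a survival threshold.

```python
def run(steps=250, threshold=30.0):
    """Toy three-tier food chain with a delayed cascade."""
    plants, herbivores, predators = 100.0, 50.0, 20.0
    history = []
    for _ in range(steps):
        plants = max(plants - 1.0, 0.0)  # habitat loss: steady decline
        if plants > threshold:
            herbivores = min(herbivores * 1.02, 200.0)  # still prospering
        else:
            herbivores = max(herbivores - 2.0, 0.0)     # food ran out
        if herbivores > threshold:
            predators = min(predators * 1.02, 200.0)
        else:
            predators = max(predators - 2.0, 0.0)
        history.append((plants, herbivores, predators))
    return history

history = run()
series = list(zip(*history))  # plant, herbivore, predator time series

def decline_onset(s):
    """First step at which the population drops."""
    return next(t for t in range(1, len(s)) if s[t] < s[t - 1])

onsets = [decline_onset(s) for s in series]
print(onsets)
```

Running it shows each tier’s decline starting later than the one below it, the same delayed cascade visible in the graphs.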
We know these are not faux correlations implying causation, because the simulated beings are coded to depend on lower-tier beings for food.
If you were to create a graph of extinction counts and top-tier population growth from the results of the simulation, it would exhibit the same pattern we see happening in the world around us.
Let that linger for a few seconds.
The silver lining is the color we’re surrounded with. We possess far greater intelligence and agency than the simulated beings. With that, and with our view from the top, comes great responsibility.
Bringing awareness is great, yet no amount of awareness and activism would have made it possible for you to buy an electric car once you became aware of the negative effects of fossil fuels, if it weren’t for the Elon Musk-led revolution in electric vehicles.
I’m very passionate about this and I’m doing something about it. Send me a message or post a comment if you’d like to do something about it too, even if you don’t know what you could do. Or tell me in a comment or a message what you’re doing and I’ll see if I can help!
Now that you’re aware, what are you going to do about it?
Salt Lake City is beautiful, mountains are lovely, everything is soooo clean
Being in Salt Lake City for a few days, it would be a shame not to explore the city a bit, especially with the weather serving us so nicely.
Whoever maintains the city deserves a medal. Everything was so clean that you could almost eat off the floor, if you could find any food on the floor. Which you couldn’t.
If we were in the tulip bubble right now, Salt Lake City would be the richest city in the world, with tulips, like gold-plated art pieces, all over the place, creating fairytale-looking scenes like this.
Angular team is committed
Being just stable is not good enough. The world is changing, user expectations are changing, developer experiences are changing, and we need to keep up. We need to make sure that Angular doesn’t fall behind and become stagnant.
Winter Ivy is coming
It’s not here yet, but when it comes it’s gonna be awesome. Maybe. Probably. It’s not going to fix everything, but it might. It’s gonna be awesome fo’ sho’. Maybe next year?
Seriously, tho… It’s going to be great, mostly.
The main, and very appreciated, reason for the delays is backwards compatibility. Nobody needs another AngularJS to Angular transition.
A build tool, to be exact. Like Make, Maven, and Gradle. Not like webpack: webpack is a module bundler, so Bazel doesn’t replace it, it works with it. Modern bundlers kind of do things that build tools used to, but with Bazel they don’t have to/shouldn’t.
Among its superpowers are building only what’s necessary and remote execution. Think scaling by adding nodes for parallel actions, a consistent execution environment for the whole team, and reusing build output between members. These alone bring huge build-time reductions for big projects and large development teams, where build (and unit and e2e testing) times can run into hours.
Google is using it internally and betting big on it. See the getting started guide.
You can have your website in space
Eric Simons demoed StackBlitz, a progressive web app that’s a full development environment. If you like VSCode, you’ll love this.
It was the snappiest, smoothest demo of the conference with some quite complex deploy processes, and redirecting a few hundred people to their twitter profile thanks to hot reload. Very smooth, Eric.
But Z, I want to have my website in space!
Eric and his team teamed up with Interorbital Systems to launch a computer into space and host your code on it. What it lacks in speed it makes up for in… actual speed, being on a rocket and all, and the cool factor.
They made a video, so take a look.
How do you get a website in space?
Click the “Deploy To Space” button at the top-right of your StackBlitz project.
Cypress and time travel testing
Cypress is the test tool of ng-conf 2019, dubbed “a test runner built for humans”. It’s an open-source, front-end testing tool built for the modern web.
It takes snapshots as your tests run. You can simply hover over commands in the command log to see exactly what happened at each step. That’s how you time travel in your tests.
When you deploy to a server, you normally have (or can make) available all the tools you have on your local machine for seeing what’s happening and debugging live code. When developing serverless, you, well… don’t have a server… and can’t do any of that. You can’t ssh into serverless, so you’re left to guess what went wrong. But if you don’t have a server, you don’t have to pay for its running time.
To get closer to the best of both worlds, Google’s serverless offering now allows you to add breakpoints to your production, running code and to add conditional logs, without having to redeploy anything.
After you trigger that code by accessing the page you want to debug, you can then time travel (kinda) because your logs and stacks are saved for every breakpoint.
Really powerful and very nicely packaged stuff that I would like to have even in a server environment.
Render on the server for speed and SEO (Angular universal)
Angular Universal will render the initial page for the user (and for search engines) right away, then load your app code, render it, and show it in place of the initial markup.
It fixes some problems and introduces some new ones. The Angular team has some interesting potential solutions they are experimenting with. And Ivy, ah Ivy, will help with some of those ideas, maybe…
How to think reactively (RxJS)
RxJS is hot, but most people don’t know how to think reactively. It’s a paradigm shift, and we have to make it to get the most out of it. subscribe is bad (avoid it if you can).
There’s a lot more to this, and RxJS popped up all over the place throughout the conference, but this was the overall theme I picked up regarding it.
Schematics and CDK
Very underutilized items in my toolbox. A schematic is a template-based code generator that supports complex logic. It’s useful for customizing Angular projects to suit the particular needs of your own organization. The Angular team thinks they’re really powerful and wants you to use them more.
An item on my ToDo list.
Typescript on the backend
NestJS is a framework that embraces TypeScript and the power of Node. You can use it to write backends in a style that feels familiar to writing Angular apps all while embracing MVC architectures you can see in other modern backend frameworks. Since it’s based on Node, you can use all the familiar libraries you’re used to.
Write once, run everywhere (cross platform development)
In the day-to-day life of a modern Homo sapiens, we use multiple devices to get our information and interact with the digital/distant world. As developers, we want to provide our users with options. We want to send the message: do it wherever you are, through whatever device you want, with the best possible experience for the device you’re using.
Sometimes that means the best experience is an app. A mobile app, a desktop app, on any platform. Different platforms require different native code. So how do we avoid duplicating our efforts?
Notable technologies to help with that (and they cross into each other’s territories in places):
In short, Angular components packaged as custom elements, a web standard for defining new HTML elements in a framework-agnostic way.
This is great because it allows you to create a web component with the full power of Angular inside it. You can then use it outside of an Angular context, which opens up some very powerful possibilities.
There are possible workarounds, not yet promoted by the Angular team, like sharing Angular code on the window object, the way you used to with jQuery.
Use it for what it’s good for 😉
Material is now Angular components, new stuff coming
People already call Material “Angular Components”. Things around that aren’t going to change much; you’ll still use them the same way you’re used to. The Angular team said they have stuff coming, though.
Those are not the only Angular components available (although they are the ones supported by the Angular team); there were plenty at the conference.
There are some that weren’t (because the name speaks for itself?)
I think of the sea. A peaceful summer morning before people crawl out of their beds. The sound of seagulls in the distance blending in with the faint fog rising above the glass-like surface. Early rays of the freshly woken sun, reminding you it’s gonna be a hot day.
I remember early mornings, lugging sand with my dad on a small cart, along a narrow, curvy path from a faraway parking lot, which was the closest place a truck could deliver it, to build a concrete deck so we wouldn’t have to sit in the dirt. That deck eventually grew into a house and took 12 full summers to build.
And a bumblebee, always passing by just around our second round. He would go his usual round circling the flowers in the same order and then land on my arm. He would stick around long enough to say “Hi!” and “What’s up, my favorite giant?”, and move along to the next patch of flowers.
“It’s all but a memory, never to be repeated again,” I realize as I sit in our modern Markham, Ontario apartment, equipped with gadgets I built to (semi-)automate it. Life is now startups, Fortune 500s, electric cars, Mars shots, and building cool things out of nothing.
Croatia was never the things I remember about it. I’m in touch with everyone who wanted to stay in touch with me. It’s not always easy, but we make it work.
I remember pushing through the bushes with my brother and another friend. “Let’s get to the other side of the island in the straightest line possible,” I said, “it may take us a day but it’s gonna be super interesting.” Around lunchtime, we found ourselves at the top of the island, with a bird’s-eye view of a tourist-filled sandy beach. Boats floating in mid-air like in a dreamland, only anchors holding them down from escaping the earth. Standing on a patch of rocks, in a sea of thick green bushes, holding a piece of bread, pancetta, and tomato, having lunch with a view.
This year seems exceptionally hard. Everybody, and their brother, is visiting Croatia. I’d like to think we did a great job promoting it. They ask me for tips. “How do I get the most out of my trip?” “Where should I visit?” “What is ______ like?”
And I answer them, provide them insights, help them have a fun, enjoyable trip and a great vacation. All the while knowing they are missing the point. They are visiting places I’ve never been to, eating food in restaurants I wasn’t able to afford. Croatia is a party with a view.
It’s a figment of my imagination, the .1% time, serenity within insanity, never to repeat again, but possible to imitate. Maybe.
So if you visit Croatia this summer, go for a walk before the sun comes up and humans crawl out, along the rocky shore where no one usually goes, after a non-drunk night, and wait for the sunrise. Look at the glass-like sea. See the mist rising close to the surface. Hear the seagulls in the distance and the crickets on top of the hill. Get drunk on the Croatia I remember. And tell me I’m crazy.
This very interesting-looking bridge is one of the first things I photographed when we came to Ottawa. The light wasn’t very special at that moment, but I really liked the symmetry. (It would also be cool if the word symmetry were symmetrical.)
Then we crossed it and ended up in Gatineau, Quebec. That’s why it’s locally known as the Interprovincial Bridge, as well as the Alexandra Bridge. Not sure why it’s not called the Royal Bridge as well, but I guess two nicknames along with the full name are quite enough.
It’s over 100 years old and used to hold some length records in its time, but as it usually goes, someone younger came along and overthrew it. The younger chap was the Quebec Bridge. Quebec now doesn’t have to share one side of the record with Ontario, I guess.
Workers had to constantly break the ice on the river while building the bridge. It’s Canada, so why am I even saying this?
Imagine approaching someone at the bar: “I’m an ice-breaker, nice to meet you!”
Again, it’s Canada so it’s 50-50 if they’d laugh at that…
A couple of summers ago, I saw a cool hack where a sensor placed on the TV detected where the emitter, placed in the glasses, was, and based on that information adjusted the viewpoint of the 3D scene displayed on the TV.
Cool, right? And all you needed were these two simple devices and the right software.
Well, why use two if you can use only one? 😀
There is already a sensor mounted on your laptop screen, called a webcam, and if you’ve ever tagged someone on Facebook, you know face recognition is a real thing. That means your computer can literally “see” where you are if you teach it how to look.
It took me a couple of days to bang things together and have proof that this was possible to achieve. It then collected dust on my hard drive for a year or two, until I woke up a couple of days ago with an idea for how to tweak it to make it presentable.
How it works
I used the OpenCV library to detect faces and their locations in the image the webcam sees. Then I picked the closest one, i.e. the biggest, and did all the calculations with it. OpenGL is responsible for rendering the 3D image, and Qt (cute) is the putty that binds them together, handling all the events and window management.
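The detection loop can be sketched in Python with OpenCV (my original build used C++ with Qt and OpenGL; the cascade file below is OpenCV’s stock frontal-face model, and the window handling here is simplified):

```python
def biggest_face(faces):
    """Pick the largest (x, y, w, h) detection, i.e. the closest face."""
    return max(faces, key=lambda f: f[2] * f[3]) if len(faces) else None

def track_faces(camera_index=0):
    import cv2  # imported here so the helper above stays dependency-free
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        face = biggest_face(list(faces))
        if face is not None:
            x, y, w, h = face
            # the face centre (x + w/2, y + h/2) would drive the 3D viewpoint
            cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        cv2.imshow("face tracking", frame)
        if cv2.waitKey(16) & 0xFF == ord("q"):  # roughly 60 frames a second
            break
    cap.release()
    cv2.destroyAllWindows()
```

Always picking the biggest detection is also what filters out small false positives, as the video later shows.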
Several things influence the right viewpoint for the 3D scene:
Size of the scene you’re viewing
Angle you’re viewing the scene from
Distance from the scene
Those things depend on:
Size of the screen
Distance from the screen
Distance from the camera
Your camera lens width
Angle you’re looking at the screen from
For a real-life application, things can be simplified. With the webcam attached to your display/laptop, the distance to the screen and the distance to the camera are very similar and oriented in the same direction.
For the demonstration, I simplified things further. The distance from the screen, for lack of stereo vision, can only be estimated.
Since I’m doing this on a laptop screen, on a table, I decided that a length of 1 in OpenGL was about 1 dm (decimetre), or about the size of a coffee mug. The distance from the camera and from the screen is roughly 6 dm.
I had no choice with the webcam lens width, and I couldn’t find any info about it online, but this was enough to run the prototype and guesstimate the other parameters.
The position of your eyes in the image defines the x and y axis coordinates of the 3D scene viewpoint and depends on the lens width. A wider lens will have more coverage (left, right, up, and down) than a narrower one, meaning that your face won’t appear in the same place on two cameras even if they have the same resolution. Parameters should be adjusted accordingly.
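As an illustration of that adjustment, here’s a hypothetical mapping from the detected face centre to viewpoint coordinates. The 60° horizontal field of view is a guessed webcam value, and the 6 dm distance matches the setup described above:

```python
import math

def viewpoint_xy(cx, cy, img_w, img_h, fov_h_deg=60.0, distance_dm=6.0):
    """Map a face centre (pixels) to scene-space x/y offsets (dm).

    A wider lens (larger fov_h_deg) maps the same pixel offset to a
    larger physical offset, which is why this parameter must be tuned
    per camera."""
    fov_h = math.radians(fov_h_deg)
    fov_v = fov_h * img_h / img_w            # assume square pixels
    nx = cx / img_w - 0.5                    # normalised offset from centre
    ny = cy / img_h - 0.5
    x = distance_dm * math.tan(nx * fov_h)
    y = -distance_dm * math.tan(ny * fov_v)  # image y grows downward
    return x, y
```

A face dead-centre in the frame maps to (0, 0); moving right or up in front of the camera shifts the viewpoint accordingly.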
The z axis represents the distance from the scene. You know how, when you get closer to your window, you can see more of the outside?
Now imagine your screen being a window into a virtual world. It would have to do the same. That’s possible to adjust with the OpenGL camera angle. I tried to estimate the distance from the screen based on the face’s surface size in the image. Unfortunately, with my webcam, that proved too volatile for a smooth experience. So I just used a fixed distance of 6 dm. This can be improved with stereo vision in the future.
The first scene in the video has the laptop screen tilted to a 45° angle and viewed from slightly above. I had to adjust the viewpoint manually since the program is not aware of the angle. If a gyroscope-like sensor were embedded in it, like in most tablets and phones, this could have been done automatically.
The second scene is from the angle you would normally look at your screen from.
The third scene is a screen capture of the program output with the camera output overlaid on the bottom left. The blue rectangle marks the location where the face was detected. If you watch carefully, you can see how a false positive was briefly detected on my shirt. That’s why it pays to always pick the biggest detected face in the scene. Also, you can play-fight over who the program will pick with a person sitting next to you 😀
Possible improvements and future work
Let’s start by saying that face detection can be CPU intensive, especially when you do it 60 times a second. We can offload that work to the GPU, and in fact OpenCV supports that kind of operation. Unfortunately for me, the driver for my GPU doesn’t. The proprietary driver does, but doesn’t support extending my screen to a second display. Guess which one I prefer 🙂
Another obvious improvement is putting more interesting graphics inside that virtual-reality window. I know boxes aren’t very interesting to look at. What would you like to see instead?
Since all the code is cross-platform, it’s fairly feasible to port it to your phone or tablet. Putting that selfie camera to good use: what would you do with this technology on your phone?
Technology limitations and possible improvements
I already mentioned the lack of depth adjustments to the image. Because we use a single camera, it’s hard to know the distance of the viewpoint. Using a higher-resolution webcam might make the guesstimating smoother. By adding a second camera and calibrating the pair, it’s possible to extract that information from the scene and adjust the viewpoint and virtual camera lens width.
Speaking of stereo images: we humans naturally have two eyes, see in stereo, and sense depth. It’s hard to fool your mind into thinking this 2D image is 3D unless you cover one eye 😀
That said, 3D video is here and you’ve probably experienced it. It is possible to produce a 3D video for 3D TVs by rendering the 3D scene twice with a slight tilt of the viewpoint. To go even further, instead of detecting just the face, it’s possible to detect the eyes and use their individual locations in the stereo image.
That would bring your gaming experience to the next level, plus ducking when things are flying your way would actually make sense. 😀
What would you do with it? How could this technology help you?
A few days after Jasmina’s Sony Xperia Z charger broke, I had another unpleasant surprise. I woke up to a dead phone, and no amount of charging, trying to turn it on, or taking the battery out and putting it back in helped…
How do I revive my phone’s dead battery?
Things to try before you jump start your battery
Turn the phone on by pressing/long-pressing the power button
Take the battery out. Press and hold the power button for at least 30s.
Try a new battery. Borrow from a friend who has the same phone.
Why this can help
The first point is self-explanatory. If your phone turned off because of a glitch or something, this turns it back on. Voila!
The second point might seem a bit weird to most. You obviously won’t turn the phone on while it has no power supply. Good thing that isn’t the point here. This step discharges residual charge from your phone’s electronic components. It also works for laptops, and I’ve used it to revive quite a few of them, inspiring statements like “It didn’t work before, I swear!”. I know, I believe you 🙂
The third point, tried after the second, makes sure it’s a problem with the battery and not the phone. You can also try your battery in your friend’s phone.
I didn’t have a spare battery to try, so I used the cable described later in the article and connected the red and black wires directly to the appropriate pins inside the phone. My Samsung S3 turned right on, so that ruled out problems with the phone itself.
If you’re in a similar situation, be careful: the right pins will depend on the phone you have. My battery’s plus and minus pads were marked, so I just followed them and saw which pins in the phone they touch. You can follow the same idea to figure out where your phone’s plus and minus pins are.
Why your battery plays dead
I’m talking about lithium-ion batteries (sometimes called Li-ion batteries or LIBs), which are the most common batteries in phones these days.
They can go unstable and go “boom” if they get too discharged. To prevent that doodoo, engineers put electronics in place with some smarts in them. If the battery voltage drops below a certain point, they disconnect the cells from the battery pads completely to prevent further discharge. Thumbs up for safety!
So why can’t you charge it?
Your phone has a built-in battery charger; otherwise you’d have to take your battery out every time to charge it. Convenient!
In order for that charger to work, it needs electricity. It gets its electricity from the battery, which currently puts out 0V.
The Hack (this is what you’re here for)
All you need is a spare USB cable and a charger (or a laptop, PlayStation, smart TV…).
Cut the USB cable.
Skin 5mm of red and black wires.
Twist the exposed end of each wire.
Plug the cable into charger.
Place the black wire (-) to the minus of the battery.
Place the red wire (+) to the plus of the battery.
Hold in place manually for a couple of minutes.
Put the battery back into the phone.
Plug the phone into the charger.
Your phone should soon start charging. If it doesn’t, try again and hold the wires in place (step 7) for longer.
Why this works
USB provides 5V, which is above the normal Li-ion battery voltage (3.7V nominal, 4.2V when full). This charges the battery enough to get briefly above the cut-off voltage. That’s just enough to start your phone’s charger and allow it to continue charging the battery until full.
I’ve been using and charging my phone normally since then. I also don’t allow my phone to die from empty battery. Lesson learned 🙂
Questions from comments
Could a universal charger work instead?
Yes, if it provides 5V of the correct polarity on the battery pads, as explained in “Why this works” section.
I’m struggling to get results, what else can I do?
You can try boosting it, as described in step 7 of “The Hack” section, for longer. Try a couple of minutes or more, depending on the state of the battery, or even charge it to full. It needs enough juice to be able to take care of its own charging process when you use the original charger. You can also try upping the voltage/current a bit.
What about in-built batteries?
Devices with built-in batteries don’t normally give direct access to the battery (why would they), and you can only reach the battery through the charging system you’re trying to circumvent.
The solution is to open the device to get to the battery. There’s a whole new set of things to watch out for there; it depends on the individual device, and it’s much riskier since you can break the device. The good news is that the same principle applies once you have the battery in your hands.
iFixit should have the tools you need to get in and you can order them online.
If your device is bricked and you can’t take it to the shop, you might as well have a little fun with it and a chance of fixing it. But be careful and don’t do anything you’re not comfortable with.
What if the wires are not the same colors as here in the picture when I cut the cord?
The best way is to use a voltmeter to determine the polarity (to find out which wire is plus and which is minus).
If you don’t have a voltmeter, you could use an LED with a resistor in series to determine polarity.
Can the mobile battery explode while jumping it?
It’s highly unlikely, but I can’t rule it out. The likelihood goes up if the battery is physically damaged.
Also, remember Galaxy Note 7? So, sure, but I wouldn’t bet on it.
What’s the difference between this reviving phone battery by jump-starting it, and charging it normally?
The goal here is not to replace your phone charger (although you could). The difference is bypassing the phone’s charging system, which refuses to charge the battery if it’s below a certain threshold.
Charging it by the wires briefly boosts the voltage (revives the battery), which enables the phone to detect it and continue charging it to full.
You don’t really need to charge the battery all the way on the wires; merely boost it enough to get over the voltage threshold that allows the phone to detect it and charge it.
Have you ever tried this? Do you have any other ways to revive dead batteries?
Have you ever found a cool-looking piece just lying around the hall? What did you do with it?
I found a chest with my name on it, just lingering in the corner of the hall, in a house halfway across the world. Curious, as I usually get, but cautious, as I also usually get, I cleared the furniture, put the chest in the middle, and set the camera on a timer.
I sat down against the wall, contemplating, mentally preparing myself for what could be in it. Some childhood nightmare, hidden away to allow happy memories to thrive? Hidden treasure to enable a happy life once I come of age and venture into the wonderful new world? Something must have struck a chord…
The chest lid sprang open!
A flash of bright green light illuminated the hall. I jumped from my seat and flew toward the chest to close it: “I’m not ready yet!”
My day started with Jasmina’s voice. Normally that makes me smile. With my face still in the pillow, and blanket over my head, I was trying to ignore the words:
“Zvonimir, I broke my phone charger!”
Three minutes later, I was on my laptop searching for “Sony Xperia Z charging port replacement”. A relatively easy fix, requiring a bit of soldering. With all my tools left in Zagreb, the next search was “Ottawa mobile phone repair”. A few phone calls later, we were pressing PRESTO cards on the bus, getting lost on the way to downtown Ottawa.
First shop was willing to fix it for $100, second told it should be about $50 but they need to see the phone first. Upon seeing the phone, they gave up since they don’t want to solder things, but told us we can order magnetic charger for $40 online.
Few hours later, back at my laptop, searching for “wireless charging Sony Xperia Z”.
One type of chargers has a dock with two pins in it. You place the phone in the dock, magnets lock the phone in place, pins touch the pads on the phone, and it starts charging.
The other type has a USB cable on one side and pins and a magnet on the other side.
So it’s really simple: get 5 V onto the phone’s charging pads. Let’s finish this tutorial-style:
Cut the USB cable.
Strip 5 mm of insulation from the red and black wires.
Twist the exposed end of each wire.
Plug the USB cable into the charger.
Place the red wire on the top pad.
Place the black wire on the bottom pad.
Notice that the orange charge indication LED on the top-right of the phone is on 🙂
Do you have some no-budget or low-budget hacks of your own?
This post answers the question “What camera do you use?“, plus much more. It’s been a long time coming because, frankly, I don’t use that much equipment, and the equipment I currently use is not that high-end.
This is good news for you if you’re just starting out, because it means you don’t have to break the bank on equipment to get results like mine, or better.
You’ll find links throughout this article to help you find the products I’m talking about. If you buy things through these links, I might make a small percentage, without it costing you anything extra. I hope that’s fine with you.
I started with my dad’s compact Sony DSC something, experimenting with manual mode and long exposure. That lasted one summer, but dad was hesitant to let me use his camera. So, long story short, I got my own camera.
After a long and tedious search and comparison, based on my desired photography genres and the budget I had, I decided on the Nikon D5100.
So far, the D5100 has proved to be an excellent camera, with a 23.6 × 15.6 mm sensor, 16.2 effective megapixels, 14-bit raw images, ISO 100 to 6400 (up to 25600 equivalent) and a lot of other goodies. There are some deficiencies I’m overcoming with accessories and tricks, like triggering a remote speedlight.
I bought the camera with the kit lens, the Nikkor 18-55mm f/3.5-5.6 AF-S. It’s a great general-purpose lens for the money. I like using the 18mm end at f/11 to f/22 on a tripod for landscapes, and the 55mm end at f/5.6 for food and people photos. There are many exceptions to these rules of thumb, but they describe the possible uses quite well.
The Nikkor 18-300mm would be another awesome (better) general-purpose lens, though more expensive. I’ve also often missed a wide-angle lens when shooting interesting places, so the Nikkor 10-24mm would be a great addition too. I do a lot of food photography and sometimes people photos, so a 50mm prime lens you can open wide for a shallow depth of field is also a nice thing to have.
I’ve put everything that isn’t camera or lens in this section.
The Nikon SB700 is a great addition to the arsenal for food photography and people photography. I’ve even used it to light-paint some landscapes and when I was shooting club scenes (night-life). It’s possible to trigger it remotely with the on-board flash on the D5100, even though the camera itself doesn’t have those remote-triggering features. I’ll explain how in a later post and link it here. Remind me if I don’t 😉
The Samsung S3 is currently my main phone and my secondary camera. I use it often when I don’t have my Nikon with me. It has a pretty good camera, but the low-light performance could be a lot better. I’ve also used it to remotely control the D5100 instead of a remote shutter controller. I’ll explain that too.
The Nexus 7 is a tablet I keep handy when going for a shoot, or often when I’m making videos with the Nikon. It can be paired wirelessly with the Samsung S3, which is connected to the camera with a USB OTG cable, giving me a live preview and camera controls on the tablet. I also use it to edit photos taken with the S3, since Snapseed is currently stalling when processing photos on the S3. No idea why.
To be fair, the 2012 version gets super slow and frustrating sometimes since the Android 5.0 update; the 2013 version is said to have that resolved. If I wait it out, it starts working snappier again. A reboot also works as a fix for a while. The advice from the community is to get something with more RAM.
Rechargeable AA batteries for the speedlight are a must, preferably two packs of four, if not more. Plus chargers for the AA batteries and the camera battery.
I have a $9 tripod I borrowed from my dad and never returned. He doesn’t use it anyway. Hopefully not because he can’t get to it 🙂
I should get a sturdy tripod with a ball head and carbon legs. That would help a lot with positioning the camera.
I also have a small Manfrotto tripod I got as a present. I use it for those cool, low-angle shots, and often for placing the camera on furniture, walls and the like when shooting video or food photos.
An HP ProBook 4540s with an Intel i7 CPU, 8 GiB of RAM and a 750 GB HDD, plus an external 1 TB ADATA USB 3.0 HDD for backups. I use it not only for editing but also for software, web and mobile app development. It’s my main tool for work when I’m not shooting. It runs Mageia Linux and it’s super snappy.
All this stuff has to fit somewhere so it can be carried around on shoots and trips. I was lucky enough to get the Vanguard Up-Rise 43 sling bag as a gift. A sling is good because the camera is easily accessible: it takes only a few seconds to get it out and shoot if you pre-set the camera settings. It’s got a lot of handy pockets and compartments, including a rain cover, whose usefulness caught me by surprise when I needed it most.
With all this in it, it can get a little heavy on one side on long walks because of the uneven weight distribution. That’s something I hadn’t anticipated, and I do walk a lot. For walks of an hour or two it should be fine. For anything longer, I’d go for a normal backpack, with the camera out and on the strap if you’re sight-seeing. That’s probably what I’ll go for next.
Now you know.
My bag is still not full and there are some things on my wish-list. If you are starting out, did this encourage you to get out and start shooting even with your smartphone?
What’s in your bag? What’s the basic you cannot go without?