Asking how a new generation of co-working spaces and networks is impacting your city.
This site has no mission. It serves as a meeting place between meeting places, joining you to a server hosted within London's Second Home. Using simple intelligence, the server mines the content you find here. It's your touch card into a growing archive of information that responds to a network of arteries. The climate of how you work and live is developing; Enquire to Annotate hopes you'll check in and see how we expand. Stay tuned to see how you can contribute.
OS: Android 8.0 Oreo
CPU: Qualcomm Snapdragon 845 2.8 GHz
GPU: Adreno 630
RAM: DDR4 8GB
Display: 5.8" 2560×1312 OLED, up to 900 nits (~88.6% Screen-to-Body Ratio)
Camera (Rear): Dual 13MP, f/1.6, Phase detection & Laser autofocus, 4K@30fps, 1440p@50fps, 1080p@60fps or 720p@240fps
Camera (Front): 8MP, f/1.9, 4K@30fps, 1440p@50fps, 1080@60fps or 720p@120fps
Battery: Li-Ion 3480mAh
Storage: 128GB, SD card up to 200GB
etc: USB 3.0 Type-C, Stereo speakers (HTC 10 style), IP67 Water & Dust resistant, Bluetooth 5.0, Rear fingerprint sensor
HEADPHONE JACK: F**K YES
I wrote the specification in more detail…maybe.
via WordPress ift.tt/2zNeQAv
Tracks steps, heart rate, walking distance, calories burnt, sleep quality, and blood pressure.
-Call and message reminder
-Raise your wrist to light the screen
-Daily goal recording
-Heart rate monitor
Heart rate detection:
-Real-time testing; data is automatically saved to the app every 30 minutes.
-On the heart rate detection interface, wait 6-7 seconds and your real-time heart rate will be displayed. It will keep tracking continuously for about 60 seconds, but this data will not be saved to the app.
Blood pressure: Normal and Personal modes
-Generally, please select Normal mode.
-If you have hypertension or hypotension, please select Personal mode, enable it under My Device, and enter your data in the settings at the same time.
-Automatically monitors sleep duration and sleep quality; you can view sleep trends to improve your sleep, and the vibrating alarm clock wakes you quietly.
-Sleep monitoring runs from 9:00 PM to 9:00 AM.
-Reminds you to leave your seat for just the right amount of exercise.
CPU: Nordic N51822
Bluetooth Version: 4.0
Sensor: 3D Gravity Sensor
Strap Material: Silicone
Stand By: 5-7 days
Battery Capacity: 90mAh
Dust-proof & Waterproof: IP67
Battery: Built-in rechargeable lithium battery
System Requirement: Android 4.3 and above, iOS 8.0 and above
What You Get:
1 x Smart Bracelet
1 x Charging Cable
1 x User Manual
Kindly note: this device is not for medical use; test data is for reference only.
If you have any questions about the product, please feel free to contact us. We will reply to you ASAP.
Tagged: Wearable, Technology, Android, Blood, Bluetooth, Bracelet, Fitness, Foero, Heart, IP67, Monitor, Pedometer, Pressure, Rate, Screen, Smart, Sports, Steps, Touch, Tracker, Waterproof, Wristband
Makita XLC02ZW 18V Compact Vacuum
(Bare Tool, Tool Only)
Whether for use in the house or the workshop, an efficient vacuum is essential to keep things spick and span.
If you’re woodworking, cleaning up dust and debris is a question of both safety and comfort. All power tools tend to kick up large volumes of menacing dust and a large upright vacuum is not necessarily the smartest option for combating this. The good news is, you have other options…
With today’s Makita XLC02ZW compact vacuum review, we’ll break down another extremely solid cordless that will grace any workshop.
We’ll have a swift glance at the pros and cons of this stick before exploring it in more depth…
* First-rate suction power that remains constant throughout the charge cycle
* Impressive 33 minutes of action on a full charge
* Accepts Makita 18V LXT battery (lithium-ion) if you want to maximize fade-free run time
* Enhanced floor nozzle perfectly designed to help maneuverability and to suck up debris from tight spots
* Weighs just over 3 pounds with the battery attached, so you can clean up without feeling the strain
* Rubberized handle gives you both comfort and control on the job
* Bagless with a dual-filtration system meaning improved efficiency and ease of cleaning
* Washable filters for ongoing economy
* 3-year limited warranty for your peace of mind
* Limited to Makita battery and charger, which you need to buy separately, bumping up the cost
* Works better with fine dust than large chunks of wood
Assuming you have a decent vacuum cleaner in your house, you most likely don’t want to drag this out into the workshop. A great solution is to buy something much lighter and more flexible like a cordless stick, sometimes known as an electric broom.
The Makita XLC02ZW is a powerful but agile stick vacuum giving you all the flexibility of a cordless unit along with a pretty impressive run time. Although the Amazon listing undersells the battery life, you’ll get around half an hour of continuous suction from this stick. Charging only takes around 20 minutes so, once the battery is flat, you’ll be back up and running in no time.
One downside is that this comes as a bare tool. You’ll need to invest in a Makita 18V lithium-ion battery and a dedicated charger so make certain you’re clear on this to avoid any nasty surprises. If you want to boost the run time, consider picking up Makita’s LXT battery.
You can opt to buy it in kit form at extra cost, which will get you three extra cloth filters.
Speaking of filters, the dual-stage system includes a filter and a pre-filter. These are washable and reusable so you won’t need to continually replace them. A pair of neat locking tabs keeps the filter firmly in place so you won’t end up with any unexpected messes.
Since it’s bagless, you’ll also save money here over time. While bagged vacuums might allow you to ditch the contents without coming into contact with any dust, the running costs can be prohibitive. These costs are further increased since you need to empty the bags before they’re much more than half filled. There’s no such nonsense with the Makita. Simply ditch the contents of the dust cup into the trash and you’re good to go.
The Makita has a very small footprint which means it’s a breeze to use and a cinch to store away without clogging up your workshop. It’s just over 18 inches long and weighs a mere 3.2 pounds when the battery is in place. Whether you want to clean up in the workshop or give your cars a quick once-over, the Makita will work wonders.
With a well-crafted rubberized handle, you’ll be able to take care of business without any cramping or blisters. The stick is nicely balanced so you won’t find it toppling over, a bugbear that blights many competing stick vacs.
The outstanding 3-year limited warranty lets you buy with complete confidence and is a nice touch for a vacuum at this price point.
If you are on the trail of a nimble little cordless stick that’s perfect for use in the workshop, home or car, the Makita XLC02ZW makes the ideal choice.
It’s pretty reasonably priced for a vacuum with such brand heritage but you need to factor in buying a battery and charger. You’ll be limited to Makita’s proprietary attachments so bear this in mind when you’re budgeting.
For a no-nonsense cordless vac that comes well-guaranteed and positively reviewed, you really can’t go wrong. Road test the Makita today so there’s no excuse for a messy workshop!
Feel free to get in touch any time with your questions or feedback. We love to hear from our readers and we’ll help out in any way we can.
Be sure to come back soon for our upcoming list of the 10 best air cleaners and dust collectors on the market.
The post Makita XLC02ZW 18V Compact Vacuum Review appeared first on A Place for Us Blog.com.
Broken gaming graphics card heat sink & cooling fan
Tagged: blackandwhite, graphicscard, HD4850, heatsink, coolingfan, macro, MacroMondays, GamesOrGamePieces, dusty
Tagged: Alexandria Brangwin, Second Life, 3D, CGI, Computer, Graphics, Virtual, world, photography, Mercedes Benz, AMG, G63, 6×6, off, road, truck, silver, desert, highway, stopped, headlights, FBI, field, agent, outfit, Sicario, Emily, Blunt, massive, tires, tread, sand, dust, dirt, beam, differential, leather, boots, cargo, pants, tactilneck, Archer
via WordPress ift.tt/2j9EfgV
Country: UK
Budget: ~£800, more if necessary
Purpose: gaming
Requirements: high performance in current and upcoming games.
Intel Core i3-8350K 4.0GHz Quad-Core Processor
ARCTIC Freezer 7 Pro Rev.2 (MX2 Thermopaste)
MSI Z370 SLI PLUS ATX
Patriot Viper Elite 8GB (2 x 4GB) DDR4-3000
Samsung – 850 EVO 250GB M.2-2280
Toshiba 1TB 3.5
MSI GeForce GTX 1060 3GB GT OC
BitFenix Nova ATX Mid Tower
Corsair Builder 430W 80+ Bronze
Base Total: £772.32
Shipping: £21.96
Total: £794.28
Note: all prices are current as of November 2017.
Recently I was contacted by a UK resident, asking for some help building a gaming PC. They did not set a specific budget, but instead provided me with a sample build and asked for my input. That build had good intentions, but lacked any direction, to put it mildly.
So instead I suggested a build with a similar price, but with more focus on gaming performance and overclocking. Meet the Balanced Gaming Build.
CPU: Core i3-8350K
This is a recently released mid-tier Intel CPU. Intel CPUs are known for their excellent per-core performance, and this CPU can be overclocked.
Overclocking makes the CPU run at a higher frequency (making it more powerful), at the cost of drawing more power and producing more heat. Maintaining a stable and effective overclock requires that other components, such as the motherboard, cooler and RAM, be of higher quality as well. So pretty much the whole build is centered on overclocking.
When pushed far enough, overclocking will also reduce the CPU’s lifespan, but PC components usually become obsolete long before reaching the end of their lifespan.
This i3 is the cheapest “decent” CPU you can get. It is still fairly expensive, but it offers excellent performance / cost ratio, and overall makes for an economical choice right now.
For example, a 6-core i5 8600k costs £90 more, but with equal clock frequency it has only slightly better performance in games.
Another example: the 6-core i5 8400. It has a maximum Turbo Boost frequency of 4.0 GHz, the same as the stock 8350k. They also cost about the same, with the i5 8400 being marginally more expensive.
However, the i5 8400 cannot be overclocked, and in most games will lose to an overclocked 8350k.
Six cores might be more relevant in the future, where we could potentially see more multi-threaded games, but it doesn’t make sense to pay extra now just so you could maybe have better performance in a few years. At that point, it would be better just to upgrade to another CPU.
Currently, all motherboards that can work with Coffee Lake CPUs have Z370 chipsets. They allow overclocking “K” CPUs via the multiplier. Motherboards with cheaper B- and H-series chipsets are not available yet.
When they do become available, i5 8400 might become a more competitive choice, because going for a non-overclocking build would significantly reduce overall cost.
But right now, you’re paying a premium for a motherboard that can overclock regardless of whether you actually intend to overclock or not. In these circumstances, it doesn’t make sense to go for i5 8400.
CPU Cooler: ARCTIC Freezer 7 Pro Rev.2
This inexpensive cooler is powerful enough to easily handle an overclocked i3 8350k. It comes with high quality MX2 thermal compound, and the fan uses a Fluid Dynamic bearing, which makes it very durable. It also comes with a 6-year warranty.
Motherboard: MSI Z370 SLI PLUS ATX
This motherboard is a somewhat unorthodox choice for this build, because clearly we’re not going for SLI. Moreover, SLI is not something I’d recommend to anyone outside of some very specific circumstances.
However, even if we are overpaying for unnecessary SLI capability, this motherboard still makes a great pick. It is fairly inexpensive, and its 10-phase power delivery system will ensure a stable and powerful CPU overclock. Heatsinks on the VRM system further improve overclock quality and system longevity.
MSI motherboards come with loads of useful performance-enhancing features, and they can automatically overclock the CPU in one click, so it will be super easy even for those who’ve never overclocked before.
Memory: Patriot Viper Elite 8GB (2 x 4GB) DDR4-3000
This is high quality, high speed memory that will ensure the 4-core CPU does not become as much of a bottleneck in highly-threaded applications.
8 GB might not be as comfortable as 16 GB would be, but it should be enough for the next few years. We are more or less trying to stay within a budget, after all.
SSD: Samsung 850 EVO 250GB M.2-2280
This is the cheapest decent SSD available at this time and place, costing about as much as the MyDigitalSSD BPX 128 GB.
The 850 EVO’s 250 GB is enough to house the operating system, other programs and a couple of games, but the rest of the storage will have to be handled by a hard drive.
Storing the operating system and programs on an SSD significantly improves performance and load times, which is why having at least some form of SSD is highly recommended.
However, if you don’t care about load times at all, you can in fact save a lot of money by not getting an SSD at all, though this approach is becoming less and less popular.
HDD: Toshiba 1TB 3.5″
Toshiba makes the most reliable HDDs at the moment, with excellent quality to cost ratio. A perfect choice for any mainstream build.
Video Card: MSI GeForce GTX 1060 3GB GT OC
AMD and nVidia often butt heads at this price point, depending on GPU performance and the amount of available VRAM.
VRAM is a weird beast. You either have enough or you don’t. If you have enough, adding more VRAM won’t do anything for you. If you don’t have enough, gaming performance will plummet.
However, it is always possible to reduce certain settings to reduce VRAM consumption. Resolution, anti-aliasing and texture quality are the biggest VRAM eaters.
First up are the GTX 1060 3 GB variants. The GTX 1060 itself offers excellent performance, and 3 GB of VRAM is enough to play the vast majority of current titles at good settings at 1080p resolution.
Right around the same price point, there is the RX 570 4 GB. It performs slightly worse than the GTX 1060, but the extra VRAM may come in handy later down the road.
Then there is the RX 580 4 GB. It is as powerful as the GTX 1060, but ~10% more expensive than the GTX 1060 3 GB.
Finally, there is the ~28% more expensive GTX 1060 6 GB version, whose chip is also slightly less cut down, giving it about 5% better performance than the GTX 1060 3 GB.
If you don’t plan on using resolution higher than 1920 x 1080, and you’re fine with occasionally turning down a few specific settings, GTX 1060 3 GB makes for a really economical choice.
It’s not as future proof as it could be, but even if you are faced with VRAM issues in a few years, it would make more sense to sell your GTX 1060 3 GB then and get another graphics card, which should offer both better performance and more VRAM.
Otherwise, RX 570 and RX 580 seem like good “in between” solutions. GTX 1060 6 GB seems hard to justify.
MSI GTX 1060 3GB GT OC in particular offers good clocks and cooling for its price, though I wish it had a dedicated heatsink for the VRM system.
Case: BitFenix Nova ATX Mid Tower
This is a really cheap case. There is nothing particularly wrong with it; it does what a case is supposed to do, but it’s not necessarily the most convenient in terms of assembly and maintenance.
If you don’t mind spending some extra time fiddling with cables and crawling around with a screwdriver, then this case is perfectly fine. Otherwise, I suggest something more expensive, like the Zalman Z3.
In addition to more convenient assembly and cable management, more expensive cases are more likely to have higher quality ports on the front panel, and they often come with nice extras, such as fans, dust filters and removable carriages for HDDs and SSDs.
Power Supply: Corsair Builder 430W
Corsair makes excellent, reliable and durable power supplies, though this is one of their cheapest products.
430 Watt may seem unusually small, but the GTX 1060 is not particularly power-hungry, so it should be more than enough to power the overclocked CPU and GPU, with plenty of juice left for other components.
Comments and Considerations
This is definitely a fine gaming build, but I am not as happy with it as I was with the previous $2200 “Make your dreams come true” build.
Things that I would consider changing:
Getting a better case or at least an extra case fan. The BitFenix Nova comes with only one case fan at rear exhaust. I would like to add one intake fan to the front panel to supply some fresh air to the Graphics Card.
Getting an extra cooler for the Motherboard’s VRM system to ensure stability and longevity of the overclock. This is most likely unnecessary overkill, as 10 phases and heatsinks should already provide more than enough durability, but better safe than sorry. What’s a $10 fan and a couple of paper clips compared to peace of mind?
Normally I would just pick a CPU Cooler that directs some airflow towards the Motherboard, as I did with the previous $2200 Build, but at this time and place there were no coolers available that could handle an overclocked 95 Watt TDP CPU and still come at a modest price.
Getting a higher grade Power Supply. While there is no reason to doubt Corsair in this regard, I would feel a bit more comfortable with a 500 or even 550 Watt PSU. It would also somewhat "future proof" the Power Supply itself, making it more relevant in future builds, which could potentially be more power-hungry.
Getting a GPU with more VRAM. Enough said about it in the GPU section.
This build is not as efficient as it could be. While overclocked i3 8350k offers excellent performance, it has no Turbo Boost, so it runs at higher frequency ALL the time, drawing more power and deteriorating faster than it should.
Power Supplies are also usually more efficient at loads significantly below their maximum.
This build is not as "future proof" as it could be either. A bare minimum of RAM and VRAM, a 4-thread CPU, a bare minimum power supply, no VRM heatsinks on the GPU. There’s no airflow over the Motherboard’s VRM either, though that’s the smallest problem, and even then it could easily be corrected.
However, not every build has to be "future proof". In fact, "future proof" builds are hard to justify economically. An overclocked i3 8350k is enough to tear through the vast majority of current and upcoming titles. So is the GTX 1060 3GB – with a few concessions.
It will be ultimately cheaper and better to upgrade specific components when it becomes necessary, swapping them out with the next generation of mainstream components with good value.
That wraps up this build. If you’d like me to make a PC Build for you, check out my PC Building Services.
£800 Build: Balanced Gaming syndicated from ift.tt/2zXu06U
Tagged: Uncategorized
via WordPress ift.tt/2jFegSc
Country: USA
Budget: ~$2250, including a monitor
Purpose: gaming and light work
Requirements: easy dusting, capability to last a long time without upgrades. Ideally, a decade.
Storage: ~256 GB SSD, 1 TB HDD.
Intel Core i7-8700K 3.7GHz 6-Core Processor
be quiet! DARK ROCK TF
ARCTIC – MX4 4g
MSI Z370 TOMAHAWK ATX
Patriot Viper 4 (2 x 8GB) DDR4-3400
MyDigitalSSD BPX 256GB M.2-2280
Western Digital Gold 1TB 3.5″
Aorus GTX 1080 Ti Xtreme Edition 11G
Phanteks ECLIPSE P400 Mid Tower
Corsair RMx 650W 80+ Gold
2x ARCTIC F14 PWM PST CO 140mm
AOC G2460PQU 24.0″ 1920×1080 144Hz
Base Total: $2239.68
Promo Discounts: -$10.00
Mail-in Rebates: -$30.00
Shipping: $0.99
Total: $2199.67
Note: all prices are current as of November 2017.
One of my recent customers from the United States asked for my help with creating a PC Build. They wanted to get the best PC within ~$2250, including a gaming monitor. The case should provide easy access for dusting, and the PC itself should be able to last a long while without any upgrades. Their previous PC lasted a decade, and they wanted the same from their new PC.
You can see the build I suggested in the spoiler above; it was accepted, and the happy customer donated $25 as a token of their gratitude for my services. With their permission, I am publishing the build, along with my reasoning for why each specific part was selected.
CPU: Intel Core i7-8700K
This beast of a CPU does not need an extensive introduction. At the moment, this is the best gaming CPU money can buy. Excellent per-core performance will carry the CPU in the present, and 6 cores with Hyper Threading ensure the CPU stays relevant in the future, where hopefully multi-threading becomes more commonplace.
It overclocks well, and we’ll definitely be counting on overclocking to future proof the build.
CPU Cooler: be quiet! DARK ROCK TF
Here I was looking for three things:
1. Airflow directed towards the CPU and motherboard. This lets some of the airflow reach the VRM heatsinks on the motherboard, greatly reducing the chances of power system failure and generally increasing the system’s lifespan and overclock quality. Relevant article.
2. Hydro / Fluid Bearing of the cooler fans. This type of bearing offers by far the best longevity.
3. Cooling powerful enough to handle the 95 Watt TDP of the 8700k + some headroom for overclocking.
At first I considered the cheaper and less powerful Slimhero, but since the budget could handle it, I decided to go with the be quiet! DARK ROCK TF, which is a good deal more powerful, and unlike the Slimhero doesn’t block any RAM slots.
Instead, DARK ROCK overlooks the RAM slots, giving them a good portion of the airflow. I made sure there is enough room under the DARK ROCK to fit the suggested RAM modules.
The DARK ROCK comes with an unspecified thermal compound, so I decided to also get ARCTIC MX4, which offers both excellent heat conductivity and longevity, with the manufacturer claiming it can last up to 8 years. I still recommended reapplying it after 5 years, though.
Motherboard: MSI Z370 TOMAHAWK ATX
MSI is a great brand overall; their boards offer many useful features that make overclocking better and easier. OC Genie will automatically overclock the CPU for you, and all MSI Z370 motherboards come with Load Line Calibration (LLC). In short, LLC reduces the negative effects of overclocking on system stability and longevity.
The Tomahawk comes with a 10-phase VRM with heatsinks, which also increase system longevity and overclock quality.
This is pretty much the ideal motherboard for this build. The only thing we’re overpaying for here is the ATX form factor, which is not really necessary for this build, but there aren’t many good mATX Z370 motherboards to choose from at the moment.
Memory: Patriot Viper 4 (2 x 8GB) DDR4-3400
This is a high quality RAM set with good reviews and excellent performance. 16 GB should be enough to last a long while, but I warned the customer that in 5-7 years an upgrade may be necessary.
SSD: MyDigitalSSD BPX 256GB
SSDs use several types of Flash memory: SLC, MLC and TLC (single-, multi- and triple- level cells respectively).
They go SLC > eMLC (enterprise MLC) > MLC > TLC in terms of price, speed and durability.
For an average consumer, TLC is usually good enough, but we are building a really durable machine that can last a decade. This is why I recommended against Samsung 960 EVO, which uses mostly TLC memory.
The MyDigitalSSD BPX 256 GB uses MLC memory and offers mindblowingly good value for its price. It is by far the most durable and best-performing consumer-grade SSD on the market, and comes with a 5 year limited warranty – "limited" meaning you must not exceed a certain amount of written data.
Ideally, I would really like to go for eMLC or even SLC memory for this build, but SSDs like that are intended for enterprise customers and professional grade server equipment, and their cost is disproportionately higher.
As long as you don’t make a habit of moving large volumes of data in and out of the SSD daily, you are very unlikely to exhaust its write endurance. If worst comes to worst, it can always be replaced with another ~$100 SSD later down the line. It simply doesn’t make economical sense to go for a vastly more expensive SSD just to avoid that one possibility.
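As a rough sketch of how that write-endurance limit plays out: warranties cap total terabytes written (TBW), so lifespan depends on your daily write volume. The TBW figure and daily volume below are hypothetical illustrations, not the BPX’s actual spec.

```python
# Hypothetical endurance estimate: years until a warranty's write limit
# (TBW, terabytes written) would be exhausted at a given daily write volume.
def years_until_tbw(tbw_limit_tb, daily_writes_gb):
    days = (tbw_limit_tb * 1000) / daily_writes_gb  # days to reach the limit
    return days / 365

# e.g. a 200 TBW limit at 20 GB written per day lasts decades
print(round(years_until_tbw(200, 20), 1))  # 27.4
```

Even with generous daily writes, an MLC drive’s warranty limit tends to outlast the machine it’s in, which is the point being made above.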
HDD: Western Digital Gold 1TB 3.5″
Conventional hard drives have moving parts, so they are more prone to failure than SSDs, and we should pay a premium to ensure durability. We’re looking for something like an entry-level server-grade or data-center HDD.
There were two good choices here, the Toshiba MG03ACA and the Western Digital Gold WD1005FBYZ. They both have 7200 RPM speed and 1 TB size, and both should have excellent longevity, though WD Gold is more heavily marketed. In the end, I went with the WD Gold because of its larger buffer size (128 MB vs 64 MB).
Video Card: Aorus GTX 1080 Ti Xtreme Edition 11G
Similarly to 8700k, the GTX 1080 Ti is the best gaming Graphics Card you can buy right now.
I went with the Aorus variant over the competition because it has a dedicated heatsink for the VRM system, which ensures durability and overclock quality. A heavy heatsink with triple fans ensures it stays quiet under heavy load.
It also has a theoretical power maximum of 375 Watt, so there is a lot of headroom for overclocking.
Case: Phanteks ECLIPSE P400 ATX Mid Tower
This case offers easy cleaning and good cable management for a modest price. The front panel is easily removed, and so are several dust filters.
I also suggested getting a few aftermarket dust filters to cover the front panel, though they are not included in this build.
There are definitely a lot of decent cases out there, but most of them are a good deal more expensive than what I’d consider justified.
Power Supply: Corsair RMx 650W
Corsair builds their PSUs to last, which is exactly what we need. The RMx series is built from higher-grade components than the usual CX series, and comes with a high-end bearing fan, ensuring it will stay quiet and efficient for years.
This particular PSU has a 10-year warranty, and it is also energy efficient. Fully modular cable system means unused cables won’t be left dangling inside the case.
How I arrived at the 650W number:
The 8700k draws up to 180 Watt when overclocked and under load.
Aorus’ GTX 1080 Ti draws up to 375 Watt – that’s the maximum physical limitation of 2x 8 pin power connections + 75 Watt from PCI-Express slot.
180 + 375 = 555 Watt
So we need a Power Supply that can output that many Watts, plus about 50 Watt for other devices, such as SSD, HDD and Motherboard itself.
Might also want to add an extra 50 Watt just to be safe.
This all comes down to 650 Watt.
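The arithmetic above can be restated as a quick sanity check. The figures are the article’s own estimates, not measured values:

```python
# Sketch of the power-budget arithmetic described above.
cpu_w = 180     # overclocked i7-8700K under load (article's estimate)
gpu_w = 375     # Aorus GTX 1080 Ti: 2x 8-pin (150 W each) + 75 W from the PCIe slot
other_w = 50    # SSD, HDD, motherboard and other devices
margin_w = 50   # extra safety headroom

total_w = cpu_w + gpu_w + other_w + margin_w
print(total_w)  # 655, so the nearest standard 650 W rating is chosen
```

Note that 650 W is only reached at the GPU’s theoretical power ceiling with the CPU fully loaded, which is why the small shortfall versus 655 W is acceptable in practice.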
Case Fan: 2x ARCTIC F14 PWM PST CO
Phanteks Eclipse P400 does come with two 120mm fans included. I wasn’t able to find what kind of fans they are. They are unlikely to use a high-end bearing, and will probably wear out with time and get noisy. Until that moment, however, there is no reason not to use them.
So we will be needing just a couple of extra fans.
The Arctic F14 offers a perfect combination of price, airflow and low noise. I use them myself, actually.
The MSI Tomahawk uses 4-pin PWM connectors for all of its case fans, which means it can be programmed to regulate the speed of case fans, making sure they work at maximum RPM (and maximum noise) only when it’s necessary.
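That speed regulation is essentially a fan curve: a mapping from a temperature reading to a PWM duty cycle. A minimal sketch with hypothetical breakpoints (real curves are configured in the BIOS/UEFI, and the numbers below are illustrative only):

```python
# Illustrative fan curve for the PWM regulation described above.
# Breakpoints are (temperature in C, duty cycle in %), hypothetical values.
def fan_duty(temp_c, curve=((30, 20), (50, 40), (70, 100))):
    """Map a temperature to a PWM duty cycle by linear interpolation."""
    t0, d0 = curve[0]
    if temp_c <= t0:
        return d0  # below the first breakpoint: minimum configured speed
    for t1, d1 in curve[1:]:
        if temp_c <= t1:
            # Interpolate between the two surrounding breakpoints
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
        t0, d0 = t1, d1
    return d0  # at or above the last breakpoint: full configured speed

print(fan_duty(40))  # 30.0 (halfway between the 20% and 40% breakpoints)
```

The payoff is exactly what the paragraph above describes: the fans only approach maximum RPM (and maximum noise) near the top of the curve, when the heat actually calls for it.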
We will engineer the following airflow in the case:
The lower front panel fan will capture cold air from outside the case and direct it towards the Graphics Card. Its triple fans will capture it, push it through the heatsink, and disperse hot air all around the graphics card, heating other motherboard components.
The CPU cooler’s double fans will capture air in the case, and also push it towards the motherboard, pumping air through motherboard VRM and RAM heatsinks.
In this case, both the Graphics Card and the CPU Cooler do not direct heated air in any particular direction. So instead of focusing on pumping cold air into the case, we’ll focus on directing hot air out of the case.
So our pair of Arctic F14 fans will be installed at the top side of the case, helping hot air escape the case.
Monitor: AOC G2460PQU 24.0″ 1920×1080 144Hz
This is a highly praised and well-reviewed monitor, and it’s more or less a steal at that price. It lacks G-Sync or FreeSync, but the high refresh rate should compensate for that. Screen tearing is barely an issue at high framerates, and you will want a high framerate in competitive online games anyway. And for more demanding, cinematic single-player games, you can just use V-Sync, since you won’t care about input lag as much. Then again, the PC we’re building is likely to easily handle even the heaviest titles at an excellent framerate anyway. In this context, it doesn’t really make sense to pay nearly double for a G-Sync monitor.
The only gripe with this monitor is that it has to be calibrated to properly display all the colors. It’s easy to do using Windows’ built-in calibration tool, but it may get tedious, as this has to be done after every Windows reinstall.
It’s important to note that this monitor still uses a 1920 x 1080 resolution, which is slowly but steadily going out of style. However, the slightly larger 2560 x 1440, 27-inch monitors are at least twice as expensive, and it’s hard to justify a huge display for gaming anyway.
Consider this article a preview of my PC Building Services.
$2200 PC Build: Make your dreams come true syndicated from ift.tt/2zXu06U
Tagged: Uncategorized
Why even Google can’t connect Cuba
Reports say Google intends to help wire Cuba and bring the island into the 21st century. But that’s not going to happen.
By Mike Elgan
Computerworld | Apr 18, 2016 3:00 AM PT
When President Obama said in Havana last month that Google would be working to improve Internet access in Cuba, I wondered what Google might do in Cuba that other companies could not.

Today, Cuba is an Internet desert where only 5% of trusted elites are allowed to have (slow dial-up) Internet connections at home, and a paltry 400,000 people access the Internet through sidewalk Wi-Fi hotspots. These hotspots have existed for only a year or so. Also, some 2.5 million Cubans have government-created email accounts, but no Web access.

I spent a month in Cuba until last week, and I was there when the president spoke. I’m here to report that those government Wi-Fi hotspots are rare, slow and expensive. While in Cuba, my wife, son and I spent about $300 on Wi-Fi. In a country where the average wage ranges from $15 to $30 per month, connecting is a massive financial burden available only to a lucky minority with private businesses or generous relatives in Miami.
And this is why I think the possibilities of what Google might accomplish in Cuba are misunderstood.

It’s not as if Cuba would have ubiquitous, affordable and fast Internet access if it just had the money or expertise to make it happen. The problem is that Cuba is a totalitarian Communist dictatorship.

The outrageous price charged for Wi-Fi in Cuba can’t possibly reflect the cost of providing the service. The price is really a way to restrict greater freedom of information to those who benefit from the Cuban system.

The strange Wi-Fi card system is also a tool of political control. In order to buy a card, you have to show your ID, and your information is entered into the system. Everything done online using a specific Wi-Fi card is associated with a specific person.

The Cuban government allows people to run privately owned small hotels, called casas particulares, and small home restaurants, called paladares. The owners of these small businesses would love to provide their guests with Wi-Fi, but the Cuban government doesn’t allow it. Nor does it allow state-owned restaurants, bars and cafes to provide Wi-Fi.

Google is connected to the global Internet through satellite networks. Cuba is connected to the Internet by an undersea fiber-optic cable that runs between the island and Venezuela. The cable was completed in 2011, and it existed as a "darknet" connection for two years before suddenly going online in 2013.

So here’s the problem with Google as the solution: The Cuban government uses high prices and draconian laws to prevent the majority of Cubans from having any access to the Internet at all. The government actively prevents access as a matter of policy. It’s not a technical problem. It’s a political one.

In other words, Cuba doesn’t need Google to provide hotspots. If the Cuban government allowed hotspots, Cubans would provide them.
Everyday Google tech is ‘Art’ in Cuba
While I was visiting Cuba, a permanent "exhibit" called Google+Kcho.MOR was on display at an art and cultural center in Havana that also promotes technology. Kcho (pronounced "KAW-cho") is the nickname of a brilliant, enterprising, prolific and self-promoting Cuban mixed-media artist named Alexis Leiva Machado. Kcho lives at the center, which he deliberately built in the traditionally poor Havana neighborhood of Romerillo, where he grew up. The M-O-R at the end of the exhibit’s name are the initials of the walled, multibuilding compound: Museo Orgánico Romerillo.

I took a Cuban death-cab to the Museo Orgánico Romerillo. And, no, the cab wasn’t one of those awesome American classico beauties from the 1950s that you see in all the pictures of Cuba. The vehicle was a tiny, charmless Eastern European clunker from the 1970s with a top speed of about 45 mph, stripped on the inside of all paneling and lining (presumably by a fire, because everything was black inside) and held together by wire, tape, glue and optimism — and I swear the exhaust pipe was somewhere inside the car. (Oh, what this correspondent isn’t willing to do for his cherished readers.)

The exhibit is an astonishing oddity to Cubans who have never traveled abroad, but it’s packed with oldish, cheap, everyday Google gear: 20 Chromebooks, Google Cardboard goggles powered by Nexus phones — and something that has never, ever existed anywhere in Cuba: free Wi-Fi.

Of course, there’s no such thing as free Wi-Fi, especially in Cuba. Kcho reportedly pays the Cuban government some $900 per month for the access. The free Wi-Fi, which I saw scores of locals using with their phones, is really subsidized. The Cuban government still gets paid. (The password for the free Wi-Fi is abajoelbloqueo — which translates, roughly, to "down with the embargo.")

The free Wi-Fi is the same slow, unreliable connection that a minority of Cubans elsewhere get to enjoy, minus the cost and the cards.
The Chromebooks, on the other hand, offer a magic Google connection some 70 times faster than regular Cuban Wi-Fi. Only 20 people at a time can enjoy the fast-connection Chromebooks, and each for just one hour at a time. When I was there, every Chromebook was in use, and each user’s focus on the screen was total, as you can imagine.

The "exhibit" also had Google Cardboard viewers. (I had read the center has 100 of them, but I saw only about a dozen.) To use them, you ask a guy working there, and he grabs a Nexus phone from a drawer and walks you through the process of launching the Cardboard app and starting it. Each Cardboard viewer has preloaded content — in my case I enjoyed a Photosphere of Tokyo.

During the half hour I spent in the Google+Kcho.MOR space, nobody else tried Google Cardboard. And that makes sense. With no ability to create or explore Cardboard content, it’s just a parlor trick to be enjoyed for a minute or two. I got the feeling that all the people there had "been there, done that" with Cardboard and resumed their obsession with Internet connectivity.

It was, however, obvious that the two people helping us were used to minds being completely blown by the Google Cardboard and Chromebook experiences. I didn’t have the heart to mention that I’ve owned several pairs of Cardboard for two years and Chromebooks for three years.

The Google+Kcho.MOR installation is called an "exhibit," but it’s not. In reality, it’s a co-marketing, co-branding effort.

For the Kcho "brand," it’s a "gateway drug" to lure Cuba’s youth to the museum and get them excited about art, culture and the world of Kcho. Along with a cheap snack bar, the free Wi-Fi and the hour a day on the fastest laptops in Cuba successfully bring hundreds of Cuban kids to the center each day, and the Google+Kcho.MOR is the main event.

For Google, it’s a massive branding effort.
(Google declined to comment for this story.)

Nobody was willing to talk about it, but it’s clear that Google is spreading some cash around here. There’s so much Google branding on everything in and on the Google+Kcho.MOR building, it looks like it could be at the Googleplex itself.

Even elsewhere in the compound, the Google logo is everywhere. It’s in several outdoor spots where the free Wi-Fi is used, including all over the snack bar that serves coffee and soda.

If you’re reading this, you probably live in a country awash in marketing, co-marketing and branding on every surface. But the ubiquity of Google branding at the entire Museo Orgánico Romerillo compound may be unique in Cuba. This is a country without a single Coca-Cola sign or billboard, zero ads anywhere for anything (other than political propaganda for the revolution and its leaders and ideals).

During the month I spent in Cuba, I saw exactly six major public consumer branding units, and all of them were at the Museo Orgánico Romerillo, and all of them were about Google (and Kcho). That makes Google by far the most heavily branded and marketed company in Cuba — in fact, the only one.

As far as I can tell, Google is getting away with it only because Kcho is massively favored by the Castro regime and the marketing is all presented as "art" or in the promotion of art.
What Google is really accomplishing in Cuba
Google appears to have begun its entry into Cuba in June 2014, when its executive chairman, Eric Schmidt, visited Cuba after slamming the U.S. embargo in a Google+ post. The visit was not reported in Cuba at the time.

Schmidt was accompanied on his trip by Brett Perlmutter, who was later appointed Cuba lead for Alphabet, Google’s parent company, as part of the Jigsaw organization, a "think tank" that actually initiates programs for making the world a better place, and was formerly known as "Google Ideas."

In January 2015, Perlmutter, as well as Jigsaw’s deputy director, Scott Carpenter, toured Cuba together. One of their goals on that trip was to visit computer science students at the University of Information Science, as well as young Cuban Internet users. Another goal, it’s easy to guess, was to meet with cultural figures like Kcho, and also key figures in the Cuban government.

Put another way, Google has been making friends and laying the groundwork for a future when the Cuban government allows greater and better Internet access.

No, Google isn’t laying fiber, launching balloons or installing equipment all over Cuba. It’s not planning to sprinkle fast, free, magic Google Wi-Fi all over the island. The best Google can do for now is make friends and influence people.

Cuba won’t join the rest of the world in ubiquitous Internet access until the Cuban government either becomes less repressive or falls out of power. When that happens, Google, as the dominant and best-connected tech brand, will be ready. Until then, no amount of magic Google pixie dust can help the Cuban people.
Fully compatible with 15" to 15.6" laptops such as MacBook Pro/ Lenovo/ ASUS/ Samsung/ Acer/ HP, as well as laptops from other brands
Protects laptop/netbook from dust, shocks, bumps, scrapes, scratches and spills
Slim and lightweight; does not bulk your laptop up and can easily slide into your briefcase, backpack, or other bag
Top-loading zipper on the sleeve glides smoothly and allows convenient access to your laptop
Product Dimensions: 16.4 x 11 x 1.2 inches
Infrared HDR. IR converted Canon 40D (Lifepixel). AEB +/-3 total of 7 exposures processed with Photomatix. Layered in PSE13.
High Dynamic Range (HDR)
High-dynamic-range imaging (HDRI) is a high dynamic range (HDR) technique used in imaging and photography to reproduce a greater dynamic range of luminosity than is possible with standard digital imaging or photographic techniques. The aim is to present a similar range of luminance to that experienced through the human visual system. The human eye, through adaptation of the iris and other methods, adjusts constantly to adapt to a broad range of luminance present in the environment. The brain continuously interprets this information so that a viewer can see in a wide range of light conditions.
HDR images can represent a greater range of luminance levels than can be achieved using more ‘traditional’ methods, such as many real-world scenes containing very bright, direct sunlight to extreme shade, or very faint nebulae. This is often achieved by capturing and then combining several different, narrower range, exposures of the same subject matter. Non-HDR cameras take photographs with a limited exposure range, referred to as LDR, resulting in the loss of detail in highlights or shadows.
The two primary types of HDR images are computer renderings and images resulting from merging multiple low-dynamic-range (LDR) or standard-dynamic-range (SDR) photographs. HDR images can also be acquired using special image sensors, such as an oversampled binary image sensor.
Due to the limitations of printing and display contrast, the extended luminosity range of an HDR image has to be compressed to be made visible. The method of rendering an HDR image to a standard monitor or printing device is called tone mapping. This method reduces the overall contrast of an HDR image to facilitate display on devices or printouts with lower dynamic range, and can be applied to produce images with preserved local contrast (or exaggerated for artistic effect).
In photography, dynamic range is measured in exposure value (EV) differences (known as stops). An increase of one EV, or ‘one stop’, represents a doubling of the amount of light. Conversely, a decrease of one EV represents a halving of the amount of light. Therefore, revealing detail in the darkest of shadows requires high exposures, while preserving detail in very bright situations requires very low exposures. Most cameras cannot provide this range of exposure values within a single exposure, due to their low dynamic range. High-dynamic-range photographs are generally achieved by capturing multiple standard-exposure images, often using exposure bracketing, and then later merging them into a single HDR image, usually within a photo manipulation program. Digital images are often encoded in a camera’s raw image format, because 8-bit JPEG encoding does not offer a wide enough range of values to allow fine transitions (and regarding HDR, later introduces undesirable effects due to lossy compression).
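The EV arithmetic above can be sketched in a few lines of Python. This is purely illustrative; the 1/60 s base shutter speed and the ±2 EV bracket width are arbitrary choices for the example, not values from the text:

```python
import math

def light_ratio(delta_ev):
    """One EV step doubles the light; a negative step halves it."""
    return 2.0 ** delta_ev

def ev_difference(brighter, darker):
    """Dynamic range between two luminance values, expressed in stops."""
    return math.log2(brighter / darker)

# A +/-2 EV bracket around a 1/60 s base exposure, varying only
# the shutter time (as the text recommends, to keep depth of field fixed):
base = 1 / 60
bracket = {ev: base * light_ratio(ev) for ev in (-2, 0, 2)}
# -2 EV -> 1/240 s, +2 EV -> 1/15 s

# A scene spanning a 1024:1 luminance ratio covers 10 stops:
stops = ev_difference(1024, 1)
```
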
Any camera that allows manual exposure control can make images for HDR work, although one equipped with auto exposure bracketing (AEB) is far better suited. Images from film cameras are less suitable as they often must first be digitized, so that they can later be processed using software HDR methods.
In most imaging devices, the degree of exposure to light applied to the active element (be it film or CCD) can be altered in one of two ways: by either increasing/decreasing the size of the aperture or by increasing/decreasing the time of each exposure. Exposure variation in an HDR set is only done by altering the exposure time and not the aperture size; this is because altering the aperture size also affects the depth of field and so the resultant multiple images would be quite different, preventing their final combination into a single HDR image.
An important limitation for HDR photography is that any movement between successive images will impede or prevent success in combining them afterwards. Also, as one must create several images (often three or five and sometimes more) to obtain the desired luminance range, such a full ‘set’ of images takes extra time. HDR photographers have developed calculation methods and techniques to partially overcome these problems, but the use of a sturdy tripod is, at least, advised.
Some cameras have an auto exposure bracketing (AEB) feature with a far greater dynamic range than others, from the 3 EV of the Canon EOS 40D, to the 18 EV of the Canon EOS-1D X Mark II. As the popularity of this imaging method grows, several camera manufacturers are now offering built-in HDR features. For example, the Pentax K-7 DSLR has an HDR mode that captures an HDR image and outputs (only) a tone mapped JPEG file. The Canon PowerShot G12, Canon PowerShot S95 and Canon PowerShot S100 offer similar features in a smaller format. Nikon’s approach is called ‘Active D-Lighting’, which applies exposure compensation and tone mapping to the image as it comes from the sensor, with the accent on retaining a realistic effect. Some smartphones provide HDR modes, and most mobile platforms have apps that provide HDR picture taking.
Camera characteristics such as gamma curves, sensor resolution, noise, photometric calibration and color calibration affect resulting high-dynamic-range images.
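The merging step described above can be sketched in pure Python, under a simplifying assumption the text does not make: a linear camera response (real pipelines first recover the response curve from the bracketed set). The triangle weighting function and the 0–255 pixel scale are illustrative choices:

```python
def weight(z):
    """Triangle weighting: trust mid-tones, distrust clipped pixels."""
    return z if z <= 127 else 255 - z

def merge_to_radiance(exposures):
    """exposures: list of (shutter_time, pixels) pairs, pixels in 0..255.
    Returns a per-pixel relative radiance estimate as floating point,
    assuming a linear sensor response."""
    n = len(exposures[0][1])
    radiance = []
    for i in range(n):
        num = den = 0.0
        for t, pixels in exposures:
            z = pixels[i]
            w = weight(z)
            num += w * (z / 255.0) / t   # radiance ~ pixel value / exposure time
            den += w
        radiance.append(num / den if den else 0.0)
    return radiance
```

With a truly linear response, each exposure votes for the same radiance, and the weighting only decides which vote to trust near the clipped ends of the range.
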
Color film negatives and slides consist of multiple film layers that respond to light differently. As a consequence, transparent originals (especially positive slides) feature a very high dynamic range.
Tone mapping reduces the dynamic range, or contrast ratio, of an entire image while retaining localized contrast. Although it is a distinct operation, tone mapping is often applied to HDRI files by the same software package.
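One concrete instance is the global Reinhard operator, which compresses luminance with L/(1+L). A pure-Python sketch follows; note this is the global variant, which reduces overall contrast uniformly (local operators additionally preserve localized contrast), and the 0.18 "key" is a conventional middle-grey choice, not a value from the text:

```python
import math

def reinhard_tonemap(luminances, key=0.18):
    """Map unbounded scene luminance into [0, 1) while preserving order,
    so the result fits a low-dynamic-range display."""
    eps = 1e-6  # guard against log(0) for pure-black pixels
    # Anchor the scene's log-average luminance to the chosen middle-grey key
    log_avg = math.exp(sum(math.log(eps + L) for L in luminances) / len(luminances))
    scaled = [key * L / log_avg for L in luminances]
    # L / (1 + L) compresses highlights hard, leaves shadows nearly linear
    return [L / (1.0 + L) for L in scaled]
```
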
Several software applications are available on the PC, Mac and Linux platforms for producing HDR files and tone mapped images. Notable titles include
Dynamic Photo HDR
HDR Efex Pro
Information stored in high-dynamic-range images typically corresponds to the physical values of luminance or radiance that can be observed in the real world. This is different from traditional digital images, which represent colors as they should appear on a monitor or a paper print. Therefore, HDR image formats are often called scene-referred, in contrast to traditional digital images, which are device-referred or output-referred. Furthermore, traditional images are usually encoded for the human visual system (maximizing the visual information stored in the fixed number of bits), which is usually called gamma encoding or gamma correction. The values stored for HDR images are often gamma compressed (power law) or logarithmically encoded, or floating-point linear values, since fixed-point linear encodings are increasingly inefficient over higher dynamic ranges.
HDR images often don’t use fixed ranges per color channel — unlike traditional images — in order to represent many more colors over a much wider dynamic range. For that purpose, they don’t use integer values to represent the single color channels (e.g., 0–255 in an 8 bit per pixel interval for red, green and blue) but instead use a floating point representation. Common are 16-bit (half precision) or 32-bit floating point numbers to represent HDR pixels. However, when the appropriate transfer function is used, HDR pixels for some applications can be represented with a color depth that has as few as 10–12 bits for luminance and 8 bits for chrominance without introducing any visible quantization artifacts.
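The storage trade-off can be demonstrated with Python's struct module, which supports the 16-bit half-precision format mentioned above (the shadow and highlight values are arbitrary example numbers):

```python
import struct

def store_half(value):
    """Encode a linear radiance value as a 16-bit half-precision float."""
    return struct.pack('<e', value)

def load_half(raw):
    return struct.unpack('<e', raw)[0]

# Half precision keeps a deep shadow and a very bright highlight
# in the same 16 bits per channel:
shadow, highlight = 0.001, 10000.0
assert load_half(store_half(shadow)) > 0.0
assert load_half(store_half(highlight)) == 10000.0

# An 8-bit integer channel scaled to reach the same highlight
# cannot represent the shadow at all -- it quantizes to zero:
assert round(shadow * (255 / highlight)) == 0
```
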
History of HDR photography
The idea of using several exposures to adequately reproduce a too-extreme range of luminance was pioneered as early as the 1850s by Gustave Le Gray to render seascapes showing both the sky and the sea. Such rendering was impossible at the time using standard methods, as the luminosity range was too extreme. Le Gray used one negative for the sky, and another one with a longer exposure for the sea, and combined the two into one picture in positive.
Mid 20th century
Manual tone mapping was accomplished by dodging and burning – selectively increasing or decreasing the exposure of regions of the photograph to yield better tonality reproduction. This was effective because the dynamic range of the negative is significantly higher than would be available on the finished positive paper print when that is exposed via the negative in a uniform manner. An excellent example is the photograph Schweitzer at the Lamp by W. Eugene Smith, from his 1954 photo essay A Man of Mercy on Dr. Albert Schweitzer and his humanitarian work in French Equatorial Africa. The image took 5 days to reproduce the tonal range of the scene, which ranges from a bright lamp (relative to the scene) to a dark shadow.
Ansel Adams elevated dodging and burning to an art form. Many of his famous prints were manipulated in the darkroom with these two methods. Adams wrote a comprehensive book on producing prints called The Print, which prominently features dodging and burning, in the context of his Zone System.
With the advent of color photography, tone mapping in the darkroom was no longer possible due to the specific timing needed during the developing process of color film. Photographers looked to film manufacturers to design new film stocks with improved response, or continued to shoot in black and white to use tone mapping methods.
Color film capable of directly recording high-dynamic-range images was developed by Charles Wyckoff and EG&G "in the course of a contract with the Department of the Air Force". This XR film had three emulsion layers, an upper layer having an ASA speed rating of 400, a middle layer with an intermediate rating, and a lower layer with an ASA rating of 0.004. The film was processed in a manner similar to color films, and each layer produced a different color. The dynamic range of this extended range film has been estimated as 1:10⁸. It has been used to photograph nuclear explosions, for astronomical photography, for spectrographic research, and for medical imaging. Wyckoff’s detailed pictures of nuclear explosions appeared on the cover of Life magazine in the mid-1950s.
Late 20th century
Georges Cornuéjols and licensees of his patents (Brdi, Hymatom) introduced the principle of the HDR video image in 1986, by interposing a matricial LCD screen in front of the camera’s image sensor, increasing the sensor’s dynamic range by five stops. The concept of neighborhood tone mapping was applied to video cameras by a group from the Technion in Israel, led by Dr. Oliver Hilsenrath and Prof. Y. Y. Zeevi, who filed for a patent on this concept in 1988.
In February and April 1990, Georges Cornuéjols introduced the first real-time HDR camera, which combined two images captured successively by a sensor, or simultaneously by two sensors of the camera. This process is a form of bracketing applied to a video stream.
In 1991, the first commercial video camera was introduced that performed real-time capturing of multiple images with different exposures, and producing an HDR video image, by Hymatom, licensee of Georges Cornuéjols.
Also in 1991, Georges Cornuéjols introduced the HDR+ image principle by non-linear accumulation of images to increase the sensitivity of the camera: for low-light environments, several successive images are accumulated, thus increasing the signal to noise ratio.
In 1993, the Technion introduced another commercial camera, this one for medical use, that produced an HDR video image.
Modern HDR imaging uses a completely different approach, based on making a high-dynamic-range luminance or light map using only global image operations (across the entire image), and then tone mapping the result. Global HDR was first introduced in 1993, resulting in a mathematical theory of differently exposed pictures of the same subject matter that was published in 1995 by Steve Mann and Rosalind Picard.
On October 28, 1998, Ben Sarao created one of the first nighttime HDR+G (High Dynamic Range + Graphic) images, of STS-95 on the launch pad at NASA’s Kennedy Space Center. It consisted of four film images of the shuttle at night that were digitally composited with additional digital graphic elements. The image was first exhibited at NASA Headquarters Great Hall, Washington DC in 1999 and then published in Hasselblad Forum, Issue 3, 1999, Volume 35, ISSN 0282-5449.
The advent of consumer digital cameras produced a new demand for HDR imaging to improve the light response of digital camera sensors, which had a much smaller dynamic range than film. Steve Mann developed and patented the global-HDR method for producing digital images having extended dynamic range at the MIT Media Laboratory. Mann’s method involved a two-step procedure: (1) generate one floating point image array by global-only image operations (operations that affect all pixels identically, without regard to their local neighborhoods); and then (2) convert this image array, using local neighborhood processing (tone-remapping, etc.), into an HDR image. The image array generated by the first step of Mann’s process is called a lightspace image, lightspace picture, or radiance map. Another benefit of global-HDR imaging is that it provides access to the intermediate light or radiance map, which has been used for computer vision, and other image processing operations.
In 2005, Adobe Systems introduced several new features in Photoshop CS2 including Merge to HDR, 32 bit floating point image support, and HDR tone mapping.
On June 30, 2016, Microsoft added support for the digital compositing of HDR images to Windows 10 using the Universal Windows Platform.
Modern CMOS image sensors can often capture a high dynamic range from a single exposure. The wide dynamic range of the captured image is non-linearly compressed into a smaller dynamic range electronic representation. However, with proper processing, the information from a single exposure can be used to create an HDR image.
Such HDR imaging is used in extreme dynamic range applications like welding or automotive work. Some other cameras designed for use in security applications can automatically provide two or more images for each frame, with changing exposure. For example, a sensor for 30fps video will give out 60fps with the odd frames at a short exposure time and the even frames at a longer exposure time. Some of these sensors can even combine the two images on-chip, so that a wider dynamic range without in-pixel compression is directly available to the user for display or processing.
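The alternating-exposure scheme described above can be sketched as follows. The clipping threshold and exposure times are illustrative assumptions, and real sensors fuse the frames on-chip with proper weighting rather than this simple fallback rule:

```python
def fuse_alternating_frames(frames, short_t, long_t, clip=250):
    """frames: interleaved [short, long, short, long, ...] exposures,
    each a flat list of 0..255 pixel values. Pairs each short frame with
    the following long frame into one HDR frame, halving the frame rate
    (e.g. 60 fps in -> 30 fps HDR out)."""
    fused = []
    for short, long_ in zip(frames[0::2], frames[1::2]):
        # Prefer the cleaner long exposure unless it clipped, then fall
        # back to the short one; dividing by exposure time puts both on
        # a common relative-radiance scale.
        fused.append([l / long_t if l < clip else s / short_t
                      for s, l in zip(short, long_)])
    return fused
```
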
In infrared photography, the film or image sensor used is sensitive to infrared light. The part of the spectrum used is referred to as near-infrared to distinguish it from far-infrared, which is the domain of thermal imaging. Wavelengths used for photography range from about 700 nm to about 900 nm. Film is usually sensitive to visible light too, so an infrared-passing filter is used; this lets infrared (IR) light pass through to the camera, but blocks all or most of the visible light spectrum (the filter thus looks black or deep red). ("Infrared filter" may refer either to this type of filter or to one that blocks infrared but passes other wavelengths.)
When these filters are used together with infrared-sensitive film or sensors, "in-camera effects" can be obtained: false-color or black-and-white images with a dreamlike or sometimes lurid appearance known as the "Wood Effect," an effect mainly caused by foliage (such as tree leaves and grass) strongly reflecting infrared in the same way visible light is reflected from snow. There is a small contribution from chlorophyll fluorescence, but this is marginal and is not the real cause of the brightness seen in infrared photographs. The effect is named after the infrared photography pioneer Robert W. Wood, and not after the material wood, which does not strongly reflect infrared.
The other attributes of infrared photographs include very dark skies and penetration of atmospheric haze, caused by reduced Rayleigh scattering and Mie scattering, respectively, compared to visible light. The dark skies, in turn, result in less infrared light in shadows and dark reflections of those skies from water, and clouds will stand out strongly. These wavelengths also penetrate a few millimeters into skin and give a milky look to portraits, although eyes often look black.
Until the early 20th century, infrared photography was not possible because silver halide emulsions are not sensitive to longer wavelengths than that of blue light (and to a lesser extent, green light) without the addition of a dye to act as a color sensitizer. The first infrared photographs (as distinct from spectrographs) to be published appeared in the February 1910 edition of The Century Magazine and in the October 1910 edition of the Royal Photographic Society Journal to illustrate papers by Robert W. Wood, who discovered the unusual effects that now bear his name. The RPS co-ordinated events to celebrate the centenary of this event in 2010. Wood’s photographs were taken on experimental film that required very long exposures; thus, most of his work focused on landscapes. A further set of infrared landscapes taken by Wood in Italy in 1911 used plates provided for him by CEK Mees at Wratten & Wainwright. Mees also took a few infrared photographs in Portugal in 1910, which are now in the Kodak archives.
Infrared-sensitive photographic plates were developed in the United States during World War I for spectroscopic analysis, and infrared sensitizing dyes were investigated for improved haze penetration in aerial photography. After 1930, new emulsions from Kodak and other manufacturers became useful to infrared astronomy.
Infrared photography became popular with photography enthusiasts in the 1930s when suitable film was introduced commercially. The Times regularly published landscape and aerial photographs taken by their staff photographers using Ilford infrared film. By 1937, 33 kinds of infrared film were available from five manufacturers, including Agfa, Kodak and Ilford. Infrared movie film was also available and was used to create day-for-night effects in motion pictures, a notable example being the pseudo-night aerial sequences in the James Cagney/Bette Davis movie The Bride Came C.O.D.
False-color infrared photography became widely practiced with the introduction of Kodak Ektachrome Infrared Aero Film and Ektachrome Infrared EIR. The first version of this, known as Kodacolor Aero-Reversal-Film, was developed by Clark and others at Kodak for camouflage detection in the 1940s. The film became more widely available in 35mm form in the 1960s, but KODAK AEROCHROME III Infrared Film 1443 has since been discontinued.
Infrared photography became popular with a number of 1960s recording artists because of the unusual results; Jimi Hendrix, Donovan, Frank Zappa and others used infrared photographs on album covers of the era.

Focusing deserves particular attention in infrared work. A small aperture and a slow shutter speed can yield acceptable sharpness without focus compensation; however, wider apertures like f/2.0 can produce sharp photos only if the lens is meticulously refocused to the infrared index mark, and only if this index mark is the correct one for the filter and film in use. It should also be noted that diffraction effects inside a camera are greater at infrared wavelengths, so stopping down the lens too far may actually reduce sharpness.
Most apochromatic (‘APO’) lenses do not have an infrared index mark and do not need to be refocused for the infrared spectrum because they are already optically corrected into the near-infrared spectrum. Catadioptric lenses do not often require this adjustment because their mirror elements do not suffer from chromatic aberration, and so the overall aberration is comparably less. Catadioptric lenses do, of course, still contain lenses, and these lenses do still have a dispersive property.
Infrared black-and-white films require special development times but development is usually achieved with standard black-and-white film developers and chemicals (like D-76). Kodak HIE film has a polyester film base that is very stable but extremely easy to scratch, therefore special care must be used in the handling of Kodak HIE throughout the development and printing/scanning process to avoid damage to the film. The Kodak HIE film was sensitive to 900 nm.
As of November 2, 2007, "KODAK is preannouncing the discontinuance" of HIE Infrared 35 mm film stating the reasons that, "Demand for these products has been declining significantly in recent years, and it is no longer practical to continue to manufacture given the low volume, the age of the product formulations and the complexity of the processes involved." At the time of this notice, HIE Infrared 135-36 was available at a street price of around $12.00 a roll at US mail order outlets.
Arguably the greatest obstacle to infrared film photography has been the increasing difficulty of obtaining infrared-sensitive film. However, despite the discontinuance of HIE, other newer infrared-sensitive emulsions from EFKE, ROLLEI, and ILFORD are still available, but these formulations have differing sensitivity and specifications from the venerable KODAK HIE, which had been around for at least two decades. Some of these infrared films are available in 120 and larger formats as well as 35 mm, which adds flexibility to their application. With the discontinuance of Kodak HIE, Efke’s IR820 film has become the only IR film on the market with good sensitivity beyond 750 nm; the Rollei film does extend beyond 750 nm, but its IR sensitivity falls off very rapidly.
Color infrared transparency films have three sensitized layers that, because of the way the dyes are coupled to these layers, reproduce infrared as red, red as green, and green as blue. All three layers are sensitive to blue so the film must be used with a yellow filter, since this will block blue light but allow the remaining colors to reach the film. The health of foliage can be determined from the relative strengths of green and infrared light reflected; this shows in color infrared as a shift from red (healthy) towards magenta (unhealthy). Early color infrared films were developed in the older E-4 process, but Kodak later manufactured a color transparency film that could be developed in standard E-6 chemistry, although more accurate results were obtained by developing using the AR-5 process. In general, color infrared does not need to be refocused to the infrared index mark on the lens.
In 2007, Kodak announced that production of the 35 mm version of their color infrared film (Ektachrome Professional Infrared/EIR) would cease as there was insufficient demand. Since 2011, all formats of color infrared film (specifically Aerochrome 1443 and SO-734) have been discontinued.
There is no currently available digital camera that will produce the same results as Kodak color infrared film although the equivalent images can be produced by taking two exposures, one infrared and the other full-color, and combining in post-production. The color images produced by digital still cameras using infrared-pass filters are not equivalent to those produced on color infrared film. The colors result from varying amounts of infrared passing through the color filters on the photo sites, further amended by the Bayer filtering. While this makes such images unsuitable for the kind of applications for which the film was used, such as remote sensing of plant health, the resulting color tonality has proved popular artistically.
Color digital infrared, as part of full-spectrum photography, is gaining popularity. The ease of creating a softly colored photo with infrared characteristics has found interest among hobbyists and professionals.
In 2008, Los Angeles photographer Dean Bennici started cutting and hand-rolling Aerochrome color infrared film. All Aerochrome in medium and large format that exists today came directly from his lab. The trend in infrared photography continues to gain momentum with the success of photographer Richard Mosse and multiple users all around the world.
Digital camera sensors are inherently sensitive to infrared light, which would interfere with normal photography by confusing the autofocus calculations, softening the image (because infrared light is focused differently from visible light), or oversaturating the red channel. Also, some clothing is transparent in the infrared, leading to unintended (at least to the manufacturer) uses of video cameras. Thus, to improve image quality and protect privacy, many digital cameras employ infrared blockers. Depending on the subject matter, infrared photography may not be practical with these cameras because the exposure times become overly long, often in the range of 30 seconds, creating noise and motion blur in the final image. However, for some subject matter the long exposure does not matter, or the motion blur effects actually add to the image. Some lenses will also show a ‘hot spot’ in the centre of the image as their coatings are optimised for visible light and not for IR.
An alternative method of DSLR infrared photography is to remove the infrared blocker in front of the sensor and replace it with a filter that removes visible light. This filter sits behind the mirror, so the camera can be used normally: handheld shooting, normal shutter speeds, composing through the viewfinder, and focusing all work as they would on an unmodified camera. Metering works but is not always accurate because of the difference between visible and infrared refraction. When the IR blocker is removed, many lenses that did display a hot spot cease to do so and become perfectly usable for infrared photography. Additionally, because the red, green and blue micro-filters remain and transmit not only their respective colors but also infrared, enhanced infrared color may be recorded.
Since the Bayer filters in most digital cameras absorb a significant fraction of the infrared light, these cameras can be relatively insensitive as infrared cameras and can sometimes produce false colors in the images. An alternative approach is to use a Foveon X3 sensor, which does not have absorptive filters on it; the Sigma SD10 DSLR has a removable IR blocking filter and dust protector, which can simply be omitted or replaced by a deep red or complete visible-light blocking filter. The Sigma SD14 has an IR/UV blocking filter that can be removed or installed without tools. The result is a very sensitive digital IR camera.
While it is common to use a filter that blocks almost all visible light, the wavelength sensitivity of a digital camera without internal infrared blocking is such that a variety of artistic results can be obtained with more conventional filtration. For example, a very dark neutral density filter can be used (such as the Hoya ND400), which passes a very small amount of visible light compared to the near-infrared it allows through. Wider filtration permits an SLR viewfinder to be used and also passes more varied color information to the sensor without necessarily reducing the Wood effect. Wider filtration is, however, likely to reduce other infrared artefacts such as haze penetration and darkened skies. This technique mirrors the methods used by infrared film photographers, who often used black-and-white infrared film with a deep red filter rather than a visually opaque one.
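As a rough illustration of how dark such a filter is for visible light, exposure time scales linearly with the filter factor. The sketch below assumes a nominal ND400 factor of 400 (real filters vary slightly); note that on a sensor without an internal IR blocker, actual exposures are far shorter than this visible-light arithmetic suggests, because much of the near-infrared passes through:

```python
import math

def nd_adjusted_exposure(metered_seconds: float, nd_factor: float) -> tuple[float, float]:
    """Return (compensated exposure time, stops of attenuation) for an ND filter.

    Exposure time scales linearly with the filter factor; the stop
    reduction is log2 of that factor. These figures describe visible
    light only: an IR-sensitive sensor sees the near-infrared the
    filter passes, so real exposures on a converted camera are shorter.
    """
    return metered_seconds * nd_factor, math.log2(nd_factor)

# A 1/250 s metered exposure behind a nominal ND400:
seconds, stops = nd_adjusted_exposure(1 / 250, 400)
print(round(seconds, 2), "s,", round(stops, 1), "stops")  # 1.6 s, 8.6 stops
```

This is why unconverted cameras behind opaque IR filters need tripod-length exposures for visible light, while the same filter on an IR-converted body remains handholdable.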
Another common technique with near-infrared filters is to swap the blue and red channels in software (e.g. Photoshop), which retains much of the characteristic ‘white foliage’ while rendering skies a glorious blue.
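The channel swap itself is a simple reordering of the image array. A minimal sketch with NumPy follows (the text mentions Photoshop; NumPy is used here purely as an illustrative stand-in):

```python
import numpy as np

def swap_red_blue(rgb: np.ndarray) -> np.ndarray:
    """Swap the red and blue channels of an H x W x 3 image array.

    On a white-balanced near-infrared capture this turns the typically
    red/orange sky blue while leaving the pale 'white foliage' intact.
    """
    # Index the last axis in B, G, R order to exchange channels 0 and 2.
    return rgb[..., [2, 1, 0]]

# Tiny demonstration with a single mostly-red "IR sky" pixel.
pixel = np.array([[[200, 60, 30]]], dtype=np.uint8)
swapped = swap_red_blue(pixel)
print(swapped[0, 0].tolist())  # [30, 60, 200]: now mostly blue
```

The operation is its own inverse, so applying it twice restores the original image; in practice photographers follow it with a white-balance or curves adjustment.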
Several Sony cameras had the so-called NightShot facility, which physically moves the blocking filter away from the light path, making the cameras very sensitive to infrared light. Soon after its introduction, this facility was ‘restricted’ by Sony to make it difficult for people to take photos that saw through clothing: the iris is opened fully and exposure duration is limited to long times of 1/30 second or more. It is possible to shoot infrared, but neutral density filters must be used to reduce the camera’s sensitivity, and the long exposure times mean that care must be taken to avoid camera-shake artifacts.
Fuji have produced digital cameras for use in forensic criminology and medicine that have no infrared blocking filter. The first camera, designated the S3 PRO UVIR, also had extended ultraviolet sensitivity (digital sensors are usually less sensitive to UV than to IR). Optimum UV sensitivity requires special lenses, but ordinary lenses usually work well for IR. In 2007, FujiFilm introduced a new version of this camera, based on the Nikon D200/FujiFilm S5 and called the IS Pro, which can also take Nikon lenses. Fuji had earlier introduced a non-SLR infrared camera, the IS-1, a modified version of the FujiFilm FinePix S9100. Unlike the S3 PRO UVIR, the IS-1 does not offer UV sensitivity. FujiFilm restricts the sale of these cameras to professional users, with their EULA specifically prohibiting "unethical photographic conduct".
Phase One digital camera backs can be ordered in an infrared modified form.
Remote sensing and thermographic cameras are sensitive to longer wavelengths of infrared (see Infrared spectrum#Commonly used sub-division scheme). They may be multispectral and use a variety of technologies which may not resemble common camera or filter designs. Cameras sensitive to longer infrared wavelengths including those used in infrared astronomy often require cooling to reduce thermally induced dark currents in the sensor (see Dark current (physics)). Lower cost uncooled thermographic digital cameras operate in the Long Wave infrared band (see Thermographic camera#Uncooled infrared detectors). These cameras are generally used for building inspection or preventative maintenance but can be used for artistic pursuits as well.