Carlo Macchiavello

All is possible

Upgrade -> Improvement … not always

In an increasingly fast-paced, hectic world, upgrading products seems essential to work, indeed indispensable, and if you fall behind it becomes a problem…

There are hardware manufacturers, mobile phone makers in particular, that even force the upgrade to new firmware and OS by downloading it automatically to the device: taking up space, producing spurious system errors when the new OS comes out to "unintentionally" push users into a reset and update, and often preventing a downgrade, or forcing the user to become a super geek to roll back. I have personally witnessed the problems caused by two leading companies in the sector: restoring devices to the previous system by hacker methods (for one of the two there is no official way to downgrade; it is actively blocked by the company's servers) restores functionality perfectly on a clean device, while doing the upgrade freezes it, slows it down, or makes the product useless.

But since you are here to talk about cameras: in the past I dabbled in hacking Panasonic firmware thanks to the tool made by the Russian hacker Vitaliy, both using ready-made presets, including the famous Flowmotion, and creating my own variants to optimize the output of the Panasonic GH2, an excellent VDSLR that, once hacked, could compete with much higher-end cameras.

Today we see that an upgrade does not always equal an advantage. I switched to digital cinema cameras long ago, Blackmagic Design in particular; I have examined their pros and cons in other articles and expressed my thoughts. Depending on your needs and tastes you may like them or not; for my needs they are close to perfection, and for their cost the results far exceed expectations.

Bought a year and a half ago, when the firmware was still very basic, release after release of increasingly sophisticated firmware the camera has become a reliable, quality production tool.

Recently, needing some low-light recovery, I noticed a defect I had never seen before, because it was not there…

Having filmed my niece in the dark, or rather lit only by the kitchen TV, so almost in the dark, I noticed a very dense black halo on the left side of the frame, and a similar but less intense one at the top.

I immediately grew uneasy and ran tests at different light levels; the problem appeared in the absence of light: pushing the signal 5 stops in post, on an image shot with the lens cap on, made it obvious, as the JPEG shows.
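To see why a push like this is so revealing, here is a minimal numpy sketch (synthetic values, not actual BMPC4K data): a near-black frame quantized to 8-bit code values, then multiplied by 2^5, which is what a 5-stop push in post does.

```python
import numpy as np

# Simulate a near-black 8-bit frame: a tiny sensor signal plus noise,
# quantized to integer code values as a camera would record it.
rng = np.random.default_rng(0)
dark = rng.normal(loc=2.0, scale=0.8, size=(4, 8))       # near-black signal
frame = np.clip(np.round(dark), 0, 255).astype(np.uint8)

# "Pushing 5 stops" in post multiplies the linear signal by 2**5 = 32.
pushed = np.clip(frame.astype(np.float64) * 2**5, 0, 255)

# After the push, neighbouring code values sit 32 levels apart:
# a difference that was invisible now reads as a distinct band.
print(sorted(set(pushed.ravel().tolist())))
```

Neighbouring code values end up 32 levels apart, which is exactly the kind of step that shows up as banding or halos in a near-black frame.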

Certain that it was not there before, I did some research and found a post right on the BMD site about this very problem with the new firmware…

After some experiments, with a lot of patience (changing the camera firmware takes about 15 minutes each time), I found that the low-light result differed depending on the firmware.

Firmware 1.9.5, pushed 5 stops, shows a slight form of banding, absent when there is a light signal. More than acceptable on such an image.

Firmware 2.0.1 shows slightly more pronounced banding.

Firmware 2.4 introduces theoretical camera optimizations (I am inclined to think otherwise) and only adds on-screen guides for the 1.85 and 2.40 formats, which I can happily skip: I can use the guides of the external monitor, or the dear old acetate sheet with marks on the control monitor.

So I conclude that, to date, staying up to date is not worth it at any cost, and that you have to know how to look back, because what I would immediately have attributed to a hardware defect is actually caused by the camera's software.

Now, I can already imagine the BMC detractors ready with outraged comments, but I will stop them by pointing out that the first Alexa I tried, a camera that on release cost 20 (twenty) times the BMPC4K, was in the following condition:

  • audio was not recorded in camera, because the firmware did not yet support the camera's XLR inputs
  • it recorded ProRes, three flavors, internally only in fullHD; 2k recording was only available from an external recorder that would arrive six months later
  • every 3-4 recordings one was skipped, without warning, so you always had to review the takes or risk finding a shot of the scene missing
  • the amount of noise in the first release was monstrous; I remember the first footage, it looked like DSLR output.
  • it bears a brand that has been a guarantee in cinema for decades, so people put up with it and did not complain on the forums, because with that camera they were working… On the other hand, film cameras jammed, scratched the footage, etc.

So this post is not meant to complain about the problem, which has been reported to BMD in great detail, but to offer a solution to those who need a fix on the fly: go back to firmware 1.9.5.

Remember: in life, those who complain into the wind waste time;
those who get resourceful and evolve move forward and solve 😀


Why I like prime lenses, and yet I still bought a couple of zooms…

Let's start from a practical assumption of optical physics: prime lenses are on average qualitatively better than the equivalent focal lengths inside a zoom, having fewer moving elements, fewer total elements, better control over each of them, etc., and they can be brighter at a relatively low cost.

Zoom lenses are on average "slower", less bright, but often stabilized, which is much rarer in primes, and they have the advantage of offering several focal lengths in the same lens.

If I'm just starting out, what should I choose?

I am old school, so for beginners it is better to start with primes: at low cost you can buy fixed lenses, even vintage ones, of good quality and brightness, and they teach the grammar of framing and focal length, to which zoom users often seem to go blind.

But what if I want a zoom?

A quality zoom requires a fair or even a major investment, for many reasons. A zoom of good optical quality costs; if you also want a bright zoom, the cost can rise to up to ten times the price of the basic zoom, but you get what you pay for.

A zoom means having several (bright) focal lengths available, with a stabilizer that helps me in camera moves or handheld shots.

Why shouldn't I buy a zoom at first, even if I can afford it?

Because there are shooting principles you don't learn when you have a zoom!
Practical example:
I have the classic trio of lenses, 35-50-85mm, with which many masters, from Hitchcock onward, have made entire movies. I have to make a shot and mount the 35mm, then I move the camera back or forward to find the correct framing of the actors; I pay attention to where I place the camera and how I compose the frame because, the focal length being fixed, so is what falls inside and outside the field of view.
The fixed lens forces me to think and decide, not to wiggle the zoom lever at random to keep everything inside the frame.
The fixed lens leads me to decide which focal length, and therefore which look and aesthetic I apply to the frame, because different focal lengths alter the perspective and consequently the final image.
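The "grammar of the focal" can be put into numbers: the horizontal angle of view follows from the focal length and the sensor width by simple trigonometry. A sketch, assuming a Super35-style sensor about 24.9mm wide (the exact width varies by camera):

```python
import math

SENSOR_WIDTH_MM = 24.9  # assumed Super35-class width; varies by camera

def horizontal_fov(focal_mm: float) -> float:
    """Horizontal angle of view in degrees for a given focal length."""
    return math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * focal_mm)))

# The classic 35-50-85mm trio and the field each one sees.
for f in (35, 50, 85):
    print(f"{f}mm -> {horizontal_fov(f):.1f} deg")
```

The 35mm sees more than twice the angle of the 85mm, which is why with primes you move the camera to reframe instead of turning a ring.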

What does the zoom do instead?

I plant myself in one spot and play with the zoom, widening and tightening, fitting everything I need inside the frame; I don't think about the fact that by changing focal length I am changing the aesthetics of the frame, or that the perceived speed of objects moving at the edge of the screen changes as I widen or tighten and pull away… the zoom cements the camera in one position, because there seems to be no need to move.

So should the zoom be avoided?

No: a zoom is convenient when you already know how to choose a focal length, when you need to carry fewer lenses, when you need a stabilized lens and the prime you own is not.
Of course a good zoom costs: if the classic 18-55 3.5-5.6 kit zoom costs a hundred euros, a brighter and more serious zoom like the 17-55 2.8 constant costs 1200 euros… but it is worth every bit of it, in sharpness, quality, brightness, and build.

I am very pragmatic: those who are starting to shoot should not use zooms, in order to learn how to choose, how to frame, how to build a shot, to always think before pressing the button; it is a great school of thought and work. Not to mention that in cinema, apart from particular special effects, the zoom is almost never used.

So why buy a zoom, especially an expensive one?

  • because if I have to make a shot with any camera movement on a 200mm I need serious stabilization, otherwise even simple breathing, with a hand resting on the follow-focus knob, shows in the shot
  • because if I have to find the right perspective compression between 85 and 135mm to compose a certain image, the zoom helps me.
  • because if I have to work in a humid or dusty environment, or with elements that could get inside the camera during a lens change, the zoom spares me that
  • because if I need several focal lengths on the fly (documentary, news, etc.), a bright zoom lets me work fast without changing lenses
  • because if I had to buy all the 2.8 primes needed to cover 70-200, I would spend more without getting stabilization on all of them.
  • because the zoom, too, has its advantages.

So what should you choose?

Neither, or rather: it would be better to own both… space and budget permitting.

Personally, over the years I have built my set of prime lenses: vintage Nikon lenses from the Nippon Kogaku series, which I like for their particular rendering of light; a single coating that in backlight does not completely filter the light but just softens it, making the images less aseptic and giving them a different character from modern lenses. For the most cinematic work I tend to use these lenses, which cover me from 24mm to 85mm, the classic focal lengths (24-35-50-85mm).
To them, for practicality and pragmatism, I add a pair of 2.8 constant zooms: the 17-55 to cover everything from a fairly wide angle (17mm) to normal (55mm), and the 70-200 IS II 2.8 constant to complete the set with all the focal lengths up to a long telephoto. So where I need stabilization with a moving camera, I know I have excellent, more than effectively stabilized lenses. Of course this set has its cost but, as any photographer knows, cameras change; lenses normally remain…


4k, the old frontier of the new: waiting for the 8k TVs we won't be able to see

While I am a proponent of shooting in 4k, because it offers so much material to work with and so many advantages in any manipulation of the footage, I am convinced that it is yet another 21st-century hoax to sell you a new TV for the same old content.

Why is 4k useless in televisions, cell phones, amateur cameras?

Leaving aside the fact that, if it is not used for postproduction, 4k is a waste: CPU resources on the capturing device, occupied memory space, device overheating; on a device below a certain size it is a useless excess. The visual acuity of an average human being resolves about 1/10 mm at about 50cm of distance. An average-size 4k panel has a density of about 110 ppi, i.e. on a 40-inch we are talking about roughly 4 pixels per linear millimetre; too bad that a 40-inch will be watched from at least 150cm, where resolving power has dropped to 2-3mm on average, so the panel carries as much as ten times the information perceivable by an average human being…
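The arithmetic above is easy to check. A sketch for a 16:9 panel (pure geometry, no assumptions beyond the 3840×2160 grid):

```python
import math

def panel_ppi(diagonal_in: float, h_px: int = 3840, v_px: int = 2160) -> float:
    """Pixels per inch of a 16:9 panel with the given diagonal in inches."""
    # Panel width follows from the diagonal and the pixel aspect of the grid.
    width_in = diagonal_in * h_px / math.hypot(h_px, v_px)
    return h_px / width_in

ppi = panel_ppi(40)        # density of a 40-inch 4k panel
px_per_mm = ppi / 25.4     # convert to pixels per linear millimetre
print(round(ppi), round(px_per_mm, 1))
```

A 40-inch 4k panel works out to about 110 ppi, roughly 4.3 pixels per linear millimetre: far finer than the 2-3mm the eye resolves from 150cm away.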

This calculation only holds if we have a pure 4k source: if on that TV we watch a 4k movie from streaming, YouTube 4k, a cell phone, or other sources, we will not actually get all that information, because the panel offers that definition but the transmitted data does not, so the TV will be essentially useless…

so why buy a 4k television today?

Let’s do a very simple and quick analysis of the pros and cons:

Pros:

  1. it is a great high-resolution digital frame for your photographs
  2. if I have a real 4k camera I can see my pictures in their full glory, provided I stand very close to the 40-inch I bought.

Cons:

  1. there is no real 4k content yet to take advantage of a 4k TV with
  2. the 4k bluray standard is still theoretical, and there are no 4k blurays on the market
  3. there are still no movies shot entirely in 4k, so you would be watching "inflated", non-native 4k movies
  4. 4k streaming actually offers less than 2.7k of effective resolution; the rest comes from interpolation, so again useless.
  5. 4k broadcasting is theory; in reality, to date there are few channels even in fullHD (1920×1080), and most are at best HD (1280×720).
  6. 4k TVs do not have decoders able to receive 4k signals, because 4k transmission standards have not yet been defined, so by the time 4k channels exist our TV will already be obsolete and unable to show them at 4k resolution.
  7. playing fullHD movies gives either a blurry view of the content (inflated 4 times) or one that is too sharpened, because the soft image is processed to mask it, actually eating several levels of detail during the sharpening. So it is a bad vehicle for watching a bluray.
  8. it costs much more than a quality fullHD equivalent, without offering, to date, the ability to show truly different images.
  9. amateur 4k images from cell phones, cameras, etc. may not carry enough detail to take advantage of the 4k matrix
  10. to perceive the real difference between a 4k TV and a fullHD TV you have to go from 50 inches up, which, however, you will watch from further away, and then you are back to the initial absurdity: useless for most people, who do not have the visual ability to appreciate the detail at a physiological level.

why is 4k not needed in cinema?

4k projection is a bit of a scam because, in reality, most movies are shot with 2k digital cameras such as the Alexa, so projecting in 4k gives no real advantage; it is an unnecessary flexing of the whole system's muscles, since it requires 4 times the space of 2k and more resources for playback and content management, without any real benefit.

But can you really not see the difference?

Nolan boasted that he shot The Dark Knight Rises and Interstellar in IMAX (an ultra-high-definition film format), and a lot of people said they noticed the difference…
I'd be curious to ask those same people in which shots they noticed it, because neither film was shot entirely in IMAX: too expensive, cameras too big and awkward, etc. So traditional s35 shots alternated with IMAX shots (mostly exteriors, where it was easier to handle bulkier cameras)… especially since, in most cases, these films were seen in digital theaters on 2k projectors, where everything was flattened and scaled down.

Another point in favor of the overall mix is that many films undergo very heavy digital postproduction; even at the level of editing and color correction alone, one cannot then distinguish, in short takes, footage shot on DSLRs, GoPros, and camcorders from that of professional cameras. All thanks to the fact that each is used in the situation that best extracts the visual quality from its sensor, making it work at its best.

So, is it too early for 4k?

Well, you shoot 4k for the future, because you are extracting a very high quality 2k and fullHD; but consuming 4k directly at home is a waste, because it is not perceptible to the eye in most home situations.

why then are 4k televisions and all the 4k peripherals proliferating?

They have to sell you something, don't they?
In marketing, numbers have always been used to give a tangible perception of value, even when the numbers have no real connection to the value of the product.

For example, burners started at 2x for CDs, up to 52x for DVDs, but nobody tells you that 52x media does not exist, because burning is a balance between write speed and the number of errors introduced; depending on the quality of the media, the speed is dosed to introduce a minimal number of write errors, so that the data can be read and, thanks to an error-correction system, traced back to the original. Read-error correction was originally born to compensate for manufacturing defects and/or scratches or damage to the media; over time, it became a way to speed up writing, on the grounds that in the end the data can still be read.

Where does the problem lie? If you push a disc to the limit of readability because you want to burn it at 52x instead of 8x, slight wear is enough to make the written data unreadable. Not only that: slow writing applies the write laser differently and, by introducing fewer errors, also makes the media more resistant to harder wear, UV exposure, deformation of the writable layer, etc.
Which makes you think about how superficially people write data to a medium, without any notion of how to do it or how to store it… good luck to the Formula 1 burners; maybe after 3-4 months they will still be able to read something back from their media.

Another example: megapixels in cameras.

Megapixels have always seemed an indication of quality, but if you squeeze 40 megapixels onto a 1/4-inch sensor you cannot expect the same cleanliness and light as 12 megapixels on a full-frame sensor, because the light captured by each photosite there is far greater. In reality, it is not just the megapixel count but also the ability of each photosite to capture information, and the area covered, that determine the actual quality of the captured image; but the concept is too complicated, so for the masses megapixels = quality.
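A back-of-the-envelope sketch makes the point; the sensor dimensions are nominal assumptions (a "1/4-inch" class sensor is roughly 3.2×2.4mm, full frame is 36×24mm):

```python
import math

def photosite_pitch_um(sensor_w_mm: float, sensor_h_mm: float, mpix: float) -> float:
    """Approximate photosite pitch in microns, assuming square pixels
    spread evenly over the whole sensor area (a simplification)."""
    area_um2 = sensor_w_mm * sensor_h_mm * 1e6   # sensor area in square microns
    return math.sqrt(area_um2 / (mpix * 1e6))    # side of one photosite

compact   = photosite_pitch_um(3.2, 2.4, 40)    # 40 MP crammed on a ~1/4" sensor
fullframe = photosite_pitch_um(36.0, 24.0, 12)  # 12 MP on full frame
light_ratio = (fullframe / compact) ** 2        # per-photosite light-gathering ratio
print(round(compact, 2), round(fullframe, 2), round(light_ratio))
```

Each full-frame photosite collects on the order of a few hundred times the light of the tiny compact one, which is where the cleanliness comes from.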

I still remember when I gave my sister a three-megapixel compact camera, in a world where several 5-6 megapixel cameras were already out, but its photographs and their detail were unmatched by price-range equivalents, because some interpolated, some had more receptors but less sensitive ones, etc.

Today one battleground for cameras is sensitivity (actually it already was 25 years ago, since even then we talked about shooting by candlelight).
If you can't shoot in the dark, and I don't mean low natural light, I mean dark, then the camera is not worth it… so a RED and an Alexa, the digital cameras used to make movies, with a native sensitivity of only 800 ISO, must be duds…

Why is it already late to buy a 4k television?

Let’s say mine is a provocative statement, but not too provocative…
because the Japanese are already experimenting with 8k broadcasts, so why buy an outdated product? You might as well go straight to 8k 😀

Jokes aside, the Japanese have always been at the forefront of experimentation. I remember the first fullHD shooting and viewing system, seen in person at SIM Audio HiFi in Milan in 1992, a joint experiment between RAI and the Japanese giant Ikegami; ironically, I captured those images with my mighty 200-line VHS, and that quality and power seemed so far away.

Well before those pioneers, back in 1986, Francis Ford Coppola, produced by George Lucas, made a special 4D video clip (3D plus what is now called augmented reality) using experimental HD cameras, starring the great Michael Jackson: Captain EO.
This is to point out that if HD already existed as a technology in 1986, and today, almost 30 years later, it is still not the TV standard, we should consider carefully how far 4k can penetrate our homes in a couple of years.
Above all, consider that 4k does not only mean changing the reception systems, which are inexpensive, but all the production and broadcasting systems; for broadcasters it would be a monstrous investment, which I doubt they will make quickly, since many are still at standard definition.

Digital miracles

When shooting with a normal (amateur) camera, DSLR, or other low-cost device, footage is captured at a decent quality, designed to be viewed and edited as is; then, to optimize quality and the memory space occupied, chroma subsampling is applied, so that less color information has to be recorded.


So a classic video records color with 4:2:0 sampling; this means that, once decoded to RGB, the red channel ends up with much less information than the others. In a normal situation it causes no particular problem and most people will not notice, but…
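What 4:2:0 does can be shown in a few lines of numpy: one chroma sample survives per 2×2 block of pixels, and the player stretches it back out on decode. A toy sketch with made-up values, where a sharp red edge does not fall on a block boundary:

```python
import numpy as np

# A tiny "chroma plane" with a sharp red-to-neutral edge at column 3
# (hypothetical values, not real codec data).
chroma = np.array([
    [200, 200, 200, 10],
    [200, 200, 200, 10],
    [200, 200, 200, 10],
    [200, 200, 200, 10],
], dtype=np.float64)

# 4:2:0 keeps one chroma sample per 2x2 block of pixels (here: the mean)...
sub = chroma.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# ...which the decoder then enlarges back by nearest-neighbour repetition.
restored = sub.repeat(2, axis=0).repeat(2, axis=1)

print(restored)
```

The clean 200→10 transition comes back as a washed-out 105 smeared across two columns; saturate that in post and the block structure starts to show.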

During postproduction, saturating the colors brings out the problem, increasing blocking, i.e. the display of the codec's compression blocks, as you can see in the image below.

why red is a bad color for filmmakers

There are colors based on the red channel that can cause obvious problems, as you see in the image, ruining a shot.
Sometimes, and I stress sometimes, you can save these images by converting them in the most appropriate way, using utilities that upsample the red channel so as to reduce the blocking effects.

There are different tools that act on these channels to reduce the defects, depending on the tools you use you can rely on different solutions :

  • Inside the RedGiant Shooter Suite the Deartifacter tool
  • The standalone 5D2RGB conversion utility, which converts 4:2:0 files to ProRes 4:2:2
  • The old HD LINK program from the Cineform Pro and Premium suites (no longer available thanks to GoPro eliminating the suite).

I personally recommend the RedGiant suite because it gives you more control, as well as many tools useful for any filmmaker.

The importance of backup

In a world where so many words are used out of proportion, the word Cloud in particular, backing up your data has never been more crucial.

If something happens to your computer, your smartphone, your camera's memory cards… you would lose everything: your data, your memories, your work…

I know what most people think: “It won’t happen to me, the data is safe on my hard drive, I have a copy in the cloud…” etc…

Well… I’m going to tell you something disturbing… none of these storage systems is secure; no one guarantees the salvation of your data. When you activate one of these services, or buy a hard drive or a card, the only guarantee you get, in the case of some cards, is that in the event of data loss they will return you a new card…

If you are on my site, it means we have something in common: 3D animation, video, images, photography; so losing your data would be no small problem…

Many of you have a backup system and feel safe…

Well, do a little research on the loss of the Toy Story 2 project files, then come back here… you may find that no one is safe, since a company as incredible as Pixar risked losing Toy Story 2's assets to a trivial filesystem problem, and they have hundreds of servers and technicians who take turns handling and managing backups… Now…

I have a raid, I’m safe…

I have often heard these words; I was convinced too. Too bad it was precisely the RAID that betrayed me 10 years ago, when identical disks (because the super-technicians advise using identical disks for RAIDs, even better with consecutive serial numbers, "so they work better", say the ignorant) failed at the same time, and my mirror RAID waved me goodbye… through my tears…

Then I relied on a wider RAID, with 4 redundancy disks, guaranteed by the super experts. Too bad that this time the damage was caused by a firmware defect on new disks, an entire pallet of hundreds with the same defect, acknowledged by the parent company; but by then my data was dead. Within a few minutes the heads began crashing into the platters, and in a short time the damage spread beyond what the redundant RAID could recover. More tears shed, about 6 years ago…

A solution?

No one has the ultimate solution; I can only describe what I use to back up my data: three copies of the data, one local on the computer, two on external hard drives, updated on alternate days.

Each block of disks is of a different brand and a different manufacturer (some brands are produced by the same manufacturers, with the same chipsets and hardware), to avoid shared chipset and firmware faults.

What do I use to keep my backups up to date?

On Windows and on Mac there are several packages that keep data synchronized, to avoid manual updates, because it is impossible to remember every single file changed each time.

On Mac I used an app called Synkron, good until High Sierra; from Mojave onward it gives a lot of trouble, AVOID IT. On Windows I use the AllwaySync program. For both OSes, an interesting solution is FreeFileSync.

Both have automation systems to synchronize multiple folders, whether local, on the network, or in the cloud.

Edit 2020: Synkron seems to no longer be updated by its author and does not work properly under Catalina; I suggest another interesting free product for Windows, MacOS X and Linux called FreeFileSync.
