Book: Mastering Blender
Previous: Chapter 3: Sculpting and Retopo Workflow
Next: Part II: Physics and Simulations

Chapter 4

Rendering and Render Engines

The ultimate goal of much 3D CG is to render the data to a 2D image or sequence of images. Blender offers a variety of rendering options, from real-time rendering in the viewport and game engine to more resource-intensive algorithms for exporting complex finished renders as images or movies. The standard Blender Internal renderer is fast and powerful, gives good results quickly enough to be well suited to animation, and is tightly integrated into Blender’s own node-based compositing workflow. The Cycles render engine is a physically based, interactive renderer capable of a high degree of realism, and it is rapidly emerging as a full replacement for the Blender Internal renderer.

In this chapter, you will learn to

- Understand rendering options and render passes
- Work with the Cycles rendering engine
- Take advantage of distributed community-based rendering on Renderfarm.fi for heavy jobs

What Is Rendering?

Rendering is the process by which the computer takes data about a scene and uses that data to create an image. 3D scene data includes the positioning and geometry of objects in the scene, material and texture properties, the viewpoint from which the render must be calculated, and many other factors that have an influence on the final appearance of the image. The process of rendering may include a variety of algorithms to produce different effects in a single image. Diffuse surface shading, specular highlights, cast shadows, ambient occlusion, reflections, and refraction through transparent objects may all be computed in different ways.
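To make the idea concrete, here is a minimal Python sketch (illustrative only, not Blender's actual code) of the kind of arithmetic behind two of those effects: a Lambert diffuse term and a Blinn-style specular highlight, computed for a single surface point:

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, to_light, to_eye, diffuse_col, spec_hardness=32):
    """Lambert diffuse plus a Blinn-style specular highlight for one point."""
    n, l, e = normalize(normal), normalize(to_light), normalize(to_eye)
    diff = max(0.0, dot(n, l))                      # Lambert term
    h = normalize(tuple(a + b for a, b in zip(l, e)))  # half-vector
    spec = max(0.0, dot(n, h)) ** spec_hardness     # tight white highlight
    return tuple(min(1.0, c * diff + spec) for c in diffuse_col)

# A point lit head-on is fully bright; one lit edge-on stays dark.
head_on = shade((0, 0, 1), (0, 0, 1), (0, 0, 1), (0.8, 0.2, 0.2))
edge_on = shade((0, 0, 1), (1, 0, 0), (0, 0, 1), (0.8, 0.2, 0.2))
```

A real renderer evaluates something like this, plus shadows, reflections, and refraction, for every sample of every pixel.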

Rendering algorithms vary in their complexity, speed, quality, and flexibility. Using different options when you render can make a big difference in how fast your scene renders and how much memory and processing power it requires.

When people talk about rendering with respect to Blender, they are most often talking about what happens when you click the Image or Animation button on the Render properties panel or when you press the F12 key on your keyboard to produce a final render of your scene to be exported to a separate image or movie file. This is known as offline rendering (as distinct from real-time rendering). Offline rendering can take an arbitrary amount of time and require an arbitrary amount of computing resources. The priority for offline rendering is generally render quality. Offline rendering in Blender depends on the render engine you’ve chosen to render with. The default render engine is known as the Blender Internal renderer, which has a wide variety of settings that influence the quality of the finished image and determine how much computing power it requires to produce the image.

However, the truth is that rendering actually happens any time Blender draws a picture of the 3D scene, which of course occurs many times per second in the 3D viewport whenever you work with a scene in Blender. This kind of rendering must be done in real time and consume few enough resources that the user can still work comfortably.

Real-Time and Viewport Rendering

Any time you work in Blender with a 3D viewport open, real-time rendering of the contents of the 3D viewport is happening. You have several options for viewing the real-time content, which are accessed through the Viewport Shading menu in the header of the 3D viewport, shown in . By default, the options include Bounding Box, Wireframe, Solid, and Texture. The Bounding Box option displays only the bounding box information for the objects and can only be used to roughly identify the locations and sizes of objects relative to each other. Wireframe mode draws objects as transparent wireframes. This shows the shapes and geometry of objects but gives no information about opacity, color, textures, or other surface qualities. Solid mode gives a view of the opaque surfaces of objects, with limited information about their materials. For example, the basic color of the material is shown, but its reflective characteristics are not accurately rendered.

Viewport Shading menu

c04f001.tif

The rendering style of Solid and Texture view modes depends also on the viewport’s Material Mode, which is controlled in the Display panel of the Properties shelf, shown in . By default, Textured Solid is not selected, so the Solid view does not show textures.

Viewport Material Mode menu

c04f002.tif

The default Material Mode is Multitexture. In Multitexture mode, the Texture view displays the object with the currently active UV texture mapped onto its surface, as shown in . Note that the image texture shown must be active, as indicated in the figure by its being displayed in the UV/Image Editor window. This display of the texture does not depend on the texture being connected to a material on the object. In fact, the object’s material, which would be rendered by the Blender Internal renderer, may be textured completely differently. Textured view also shows an approximation of the actual scene lighting, although this is only a rough approximation and does not include many surface characteristics of the material.

Textured view with Multitexture Material Mode and an active texture

c04f003.tif

With Multitexture Material Mode selected and the Textured Solid option checked, the Solid view also displays the active texture, as shown in . This view does not represent any scene lighting characteristics but rather uses the standard Solid view lighting setup, which can be adjusted in your user preferences.

Textured Solid view

c04f004.tif

If your graphics card supports the OpenGL Shading Language (GLSL), you can obtain a more accurate textured preview by selecting GLSL Material Mode in the Display panel. Unlike Multitexture display, GLSL Texture view shows UV textures that are associated with actual materials on the object, as shown in . In this view, scene lighting is represented, and material properties such as specularity are rendered more accurately than in Multitexture mode.

Textured view with GLSL mode and a textured material

c04f005.tif

Rendering with Blender Internal

Once you’ve created a scene that you want to render to a final image, you will select Image or Animation on the Render properties panel, shown in , depending on whether you want a single frame rendered or a sequence (or movie). The Dimensions and Output tabs, shown in , are where you set parameters connected with the dimensions of the final image and where you choose the output directory and format of your renders.

Render properties panel

c04f006.tif

World Properties

The World properties panel, shown in , includes a number of important settings for your render. Ambient Occlusion (AO) is a geometry-based method of simulating occluded (darkened) areas in a scene due to their being recessed or concealed by other scene geometry. In real life, nooks and crannies are darker than open areas because less light can get to them. Ambient Occlusion simulates this effect. Environment Lighting works with AO to provide the illumination that the AO will influence. You can change the energy level of the environment lighting and the color. The default is White, but if you set this to use a texture, you can get a roughly accurate image-based lighting result (see the following section on the Cycles render engine for more accurate image-based lighting).
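The relationship between AO and Environment Lighting can be sketched in a few lines of Python (a conceptual simplification, not Blender's shading code): the environment's color and energy set the available ambient light, and the AO factor scales it down in occluded areas:

```python
def environment_contribution(env_color, energy, ao_factor):
    """Ambient light reaching a surface point, scaled by how occluded it is.

    ao_factor is 1.0 at a fully open point and falls toward 0.0
    deep inside nooks and crannies.
    """
    return tuple(c * energy * ao_factor for c in env_color)

white = (1.0, 1.0, 1.0)
open_area = environment_contribution(white, energy=1.0, ao_factor=1.0)
crevice = environment_contribution(white, energy=1.0, ao_factor=0.2)  # darker
```

Replacing the flat `env_color` with values sampled from a texture gives the rough image-based lighting effect described above.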

Dimensions and Output properties

c04f007.tif

World properties

c04f008.tif

Activating Indirect Lighting enables materials with Emit values to contribute to lighting the scene. In order for this to work, Approximate must be selected as the Gather method, rather than Raytrace.

All of these effects contribute to the final render, but all of them can be isolated as well, using render passes.

Render Layers and Render Passes

For simple scenes, you can simply click the Image or Animation button and render the scene as is. However, for more complex scenes, especially ones intended to be used as input for Blender’s node-based compositing system, which you’ll begin to read about in Chapter 9, “Compositing with Nodes,” it is necessary to be able to separate components of your scene to be handled individually by the compositor. The two main tools for doing this are render layers and render passes. Properties for both render layers and render passes are found in the Layers panel in the Render properties window, shown in .

Render Layers properties

c04f009.tif

Render layers have many uses, which will become much clearer when you begin to work with the node-based compositing system. The most basic use of render layers is to divide objects in a scene for separate rendering, based on the Scene layer where the objects are found. A simple scene of a cube and a sphere is shown in , with the cube placed on layer 1 and the sphere placed on layer 2.

A simple scene with objects placed on layer 1 and layer 2

c04f010.tif

In , you can see the settings for two render layers, called RenderLayer1 and RenderLayer2. Under the label Scene is a set of boxes showing the visible scene layers (this is exactly the same information that is shown in the 3D viewport header). This determines what scene layers will contribute to the final render. Under the Layer label, the scene layer(s) that will be rendered by each individual render layer are selected.

Render Layers settings

c04f011.tif

The finished renders of RenderLayer1 and RenderLayer2 are shown in the two figures that follow. Note that although RenderLayer1 includes only the cube and RenderLayer2 includes only the sphere, both of them include shadow information from the other object. This is because both scene layers contributed to the final render.

RenderLayer1 rendered

c04f012.tif

RenderLayer2 rendered

c04f013.tif

Render layers in themselves are useful, but they become far more powerful when coupled with render passes. Render passes represent another way to separate information to use in compositing. However, render passes separate information based on the kind of visual information it is. In you can see some examples of render passes. Shown here are passes for color, ambient occlusion, diffuse shading, specular highlights, depth, environment lighting, emitted light, shadows, indirect lighting, and reflections.
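A hedged way to picture what the compositor later does with passes: each pass holds one kind of light contribution per pixel, and in a simplified additive model (Blender's real recombination also involves multiplicative color terms) the final image is their sum:

```python
def recombine(passes):
    """Naively recombine per-pixel render passes by addition.

    A rough simplification, but it captures why passes are useful:
    a compositor can adjust one contribution without re-rendering.
    """
    width = len(next(iter(passes.values())))
    combined = [0.0] * width
    for values in passes.values():
        combined = [c + v for c, v in zip(combined, values)]
    return combined

# A one-pixel "image" split into three passes:
pixel = recombine({"diffuse": [0.5], "specular": [0.3], "emit": [0.1]})
```

Doubling only the "emit" pass before recombining would brighten the lamps without touching the rest of the lighting.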

Render passes

c04f014a.tif c04f014b.tif c04f014c.tif c04f014d.tif

If you choose Multilayer EXR as the output format for your render, the files you render to will also contain all the render pass information to be used later in compositing. For this reason, Multilayer EXR is an extremely convenient format. A disadvantage of working with Multilayer EXR is that it is more difficult to preview and is not supported directly in many 2D editors such as GIMP.

As with render layers, render passes are primarily useful in compositing. You’ll learn much more about render passes and how they are used in Chapter 9.

Rendering with Cycles

As of Blender version 2.61, a new option for rendering has been introduced, the Cycles render engine. Cycles makes use of an improved, physically based shader and material system and is intended to produce realistic renders while still offering a substantial amount of control to artists. At the time of this writing, Cycles is not a complete replacement for Blender Internal. It still lacks a number of features of Blender Internal, including subsurface scattering, support for particles and strand rendering, and volumetric shaders. Some shading “cheats” that are possible in Blender Internal are not available in Cycles. For what it can do, however, Cycles is a great improvement over Blender Internal and can achieve much more realistic renders.

Cycles and Sampling

Cycles works by following the paths that light would take as it passes through a scene and ends up on a pixel in the finished image. As light bounces around a scene, its energy level changes based on the reflective, refractive, and absorptive properties of the objects it hits. For example, when light bounces off a green surface, it takes on a green hue, which affects the way it contributes to the lighting of other objects. This interplay between reflected light and objects makes Cycles renders appear much more realistic than renders made with Blender’s Internal renderer.
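That color bleeding can be sketched in a few lines (a simplification of what Cycles actually integrates): the light a path carries is multiplied, component by component, by the reflectance of each surface it bounces off:

```python
def path_throughput(light_color, surfaces):
    """Tint carried by a light path as it bounces off each surface in turn.

    At every bounce the path's color is multiplied component-wise by the
    surface's reflectance, so white light bounced off a green wall
    arrives green-tinted at whatever it lights next.
    """
    r, g, b = light_color
    for sr, sg, sb in surfaces:
        r, g, b = r * sr, g * sg, b * sb
    return (r, g, b)

white_light = (1.0, 1.0, 1.0)
green_wall = (0.1, 0.9, 0.1)
tinted = path_throughput(white_light, [green_wall])       # one bounce
dimmer = path_throughput(white_light, [green_wall] * 2)   # two bounces
```

Each extra bounce both tints the light further and reduces its energy, which is why indirect light falls off so naturally in path-traced renders.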

To use Cycles as your renderer, you must select it as the render engine in the Info window header, as shown in . When you’ve done this, clicking Image or Animation on the Render properties panel will create a Cycles render as output. In addition, a new option will appear in the Viewport Shading menu in the 3D viewport header, called Rendered, as shown in . When you select this, Cycles will render your scene in real time in the 3D viewport. Most modern video cards will be able to handle this, but faster video cards will help speed it up a lot.

Selecting Cycles

c04f015.tif

Rendered view option

c04f016.tif

In a real-life scene, light travels in the form of many photons. In a CG-rendered scene, the number of light paths that need to be approximated depends in part on the pixel dimensions of the final render. Cycles uses a random-sampling algorithm to choose which light paths to approximate for the final render. As the number of samples increases, the clarity of the image also increases. The number of samples Cycles renders can be set on the Integrator panel, shown in . The default for both the Render and Preview sample values is 10, but you will probably want to raise this value, especially for rendering, in order to get acceptable quality. Depending on your scene, hundreds or even thousands of samples may be necessary to completely eliminate random noise. In you can see an example of three renders of the same scene with 1, 10, and 100 samples.
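The effect of the sample count on noise is easy to demonstrate with a toy Monte Carlo estimator (illustrative only; Cycles' sampling is far more sophisticated):

```python
import random

def render_pixel(true_value, samples, rng):
    """Monte Carlo pixel estimate: the average of noisy per-path values."""
    total = 0.0
    for _ in range(samples):
        total += true_value + rng.gauss(0.0, 0.5)  # each path carries noise
    return total / samples

def avg_error(n_samples, trials, rng):
    """Average distance from the true value over several renders."""
    return sum(abs(render_pixel(0.5, n_samples, rng) - 0.5)
               for _ in range(trials)) / trials

rng = random.Random(1)
coarse = avg_error(10, 50, rng)    # noisy, like a 10-sample preview
fine = avg_error(1000, 50, rng)    # much cleaner
```

The error shrinks roughly with the square root of the sample count, which is why getting from "noisy" to "clean" can take hundreds or thousands of samples.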

Nodes Revisited

If you’ve gone on ahead here and tried to render a scene in Cycles that was originally made to be rendered in Blender Internal, you’ve probably noticed that things don’t come out quite as you might have expected. This is because Cycles uses a completely separate, fully node-based material system.

Integrator properties

c04f017.tif

Rendered with 1, 10, and 100 samples

c04f018a.tif c04f018b.tif

To demonstrate the Cycles material system, I added a little bit of geometry to the default cube to make it a slightly more interesting subject to render. You can find the .blend file among the downloadable files for this book, or you can model your own simple shape.

Starting from the default state of no materials at all, a Cycles render looks like .

A Cycles render with no material

c04f019.tif

To set up a material for the object, open a Node Editor window and, with the object selected, click the Object and Material buttons on the Node Editor header and check the Use Nodes box, as shown in . Immediately, you should see a Diffuse BSDF node and a Material Output node appear, just as in the figure. Your object now has a material. You can change the color of the material by clicking the Color field and using the color picker.

A Cycles Material node

c04f020.tif

Image-Based Lighting in Cycles

Next, we’ll turn to lighting. Cycles has many options for lighting, and it can already use most of the Blender lamp types. However, Cycles has the great advantage of being able to use high dynamic range (HDR) images directly as light sources. HDR images can provide complete lighting in Cycles, including cast shadows. Image-based lighting is exceedingly useful for CG effects that are meant to be composited into live-action video.
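Under the hood, an environment texture is just an image indexed by direction. For the panoramic (latitude/longitude) format used here, the lookup can be sketched as follows (a standard mapping, not code taken from Blender):

```python
import math

def latlong_uv(direction):
    """Map a 3D direction to (u, v) coordinates in a panoramic map.

    Longitude becomes the horizontal coordinate, latitude the vertical,
    so every direction around the scene lands somewhere in the image.
    """
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)   # longitude
    v = 0.5 + math.asin(z / length) / math.pi      # latitude
    return (u, v)

up = latlong_uv((0.0, 0.0, 1.0))       # straight up: top edge of the image
forward = latlong_uv((1.0, 0.0, 0.0))  # +X: center of the image
```

For each light path that escapes the scene, the renderer looks up the HDR pixel in this way and uses its (possibly very bright) value as incoming light.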

Setting up image-based lighting is straightforward. Click the World icon on the Node Editor header to get to the node setup shown in , which includes a Background node and a World Output node. By default, the Background node is set to a slate-gray color. This needs to change.

A Background node

c04f021.tif

To change the Background color, add a Texture node by pressing Shift+A and choosing Texture > Environment Texture, as shown in .

Adding an Environment Texture node

c04f022.tif

In the Image menu of the Environment Texture node, open your HDR image from your computer, as shown in . The HDR image should be in rectilinear (panoramic) format. You can use the file BI-Rectilinear.hdr from the files that accompany this book.

An HDRI Environment Texture node

c04f023.tif

That’s all there is to it. Switch the viewport shading mode to Rendered to see the object lit by the texture, as shown in .

Node-Based Material Textures

You can do a lot with Cycles material nodes. In this section, I’ll walk through the steps of using Cycles material nodes to set up a simple grunge map for a metallic material. Grunge mapping, like the related technique of dirt mapping, means adding patterns of corrosion, rust, dirt, or wear to textures in a way that suits the shape of the textured object. This can be done by hand or by using a variety of techniques to get appropriate patterns. Ambient occlusion is a common intermediary tool for creating grunge maps, because it calculates values on an object based on the object’s recessed and protruding geometry, which often roughly matches where you would expect to find grungy buildups and buffed clean areas on an object.

You can follow along by using the texture images you will find on the website for this book. The copper textures are shown in . The first one is a relatively clean, reddish copper; the second is mostly bluish corrosion. The copper textures were created by modifying textures from , and they are made available with permission from the copyright owner. Other textures from that website are available free of charge for use in commercial and noncommercial projects, with the stipulation that the textures themselves not be redistributed.

The object lit by the image

c04f024.tif

Two copper textures

c04f025.tif

The third image we’ll be using is an AO pass render, shown in . This has been baked in advance and is already mapped to the object in the sample file.

An AO pass rendered as an image

c04f026.tif

The basic grunge map is simply a matter of mixing the two metal textures using the AO pass as a factor. The node setup to do this is shown in . The Texture Coordinate node ensures that the objects are mapped by UV coordinates (UV mapping has been set up in advance). The AO pass image texture is sent through a Brightness/Contrast node, enabling more control over the levels of mixing, and then plugs into the Factor socket of the Color Mix node. The output of the Color Mix node gives the diffuse shader its color. The resulting rendered object is shown in .
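Numerically, this node chain boils down to a blend driven by the AO value. The sketch below uses plain Python with made-up copper RGB values (not the book's actual textures), and the Brightness/Contrast formula is only an approximation of the node's behavior:

```python
def brightness_contrast(value, brightness=0.0, contrast=0.0):
    """Rough stand-in for the Brightness/Contrast node, clamped to [0, 1]."""
    v = (value - 0.5) * (1.0 + contrast) + 0.5 + brightness
    return min(1.0, max(0.0, v))

def mix(col_a, col_b, fac):
    """The Color Mix node: fac=0.0 gives col_a, fac=1.0 gives col_b."""
    return tuple(a + (b - a) * fac for a, b in zip(col_a, col_b))

clean_copper = (0.72, 0.45, 0.20)   # illustrative values
corrosion = (0.25, 0.50, 0.45)

# High AO values (open areas) push the factor to 1 -> clean copper;
# low AO values (crevices) push it to 0 -> corrosion.
fac_open = brightness_contrast(0.9, contrast=0.5)
fac_crevice = brightness_contrast(0.1, contrast=0.5)
exposed = mix(corrosion, clean_copper, fac_open)
recessed = mix(corrosion, clean_copper, fac_crevice)
```

Raising the contrast sharpens the boundary between clean and grungy regions, which is exactly the control the Brightness/Contrast node gives you in the node tree.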

We can take this a bit further. Rather than combining the textures as mere color inputs for a single diffuse shader, we can use the color for both a diffuse shader and a glossy (reflective) shader. This would be more accurate for metal, because metal is generally not a perfectly matte surface. There is usually quite a bit of specularity and reflectivity to metal.

On what basis should we mix the diffuse and glossy shaders? Since we’re working with a copper texture, we might consider trying to isolate the red levels in the surface. Redder parts of copper are generally shinier than the bluer parts.

Combining image textures with nodes

c04f027.tif

Object rendered with AO-based grunge map

c04f028.tif

To do this, you’ll split up the RGB values from the color mix node by choosing Converter > Separate RGB from the Add menu (Shift+A), as shown in .

Separating color channels

c04f029.tif

You’ll need to do a little math on the RGB channels to get the relationship you want. In short, the idea is to compare the red levels of the image to the non-red levels, which can be taken to mean the average of the blue and green levels. You can see the full node setup for this in . The math nodes are accessible from Converter > Math in the Add menu. The G and B channels feed into an Add node, whose output feeds into a Divide node with a second value of 2 (to divide the G and B sum by 2). This output is subtracted from the red value using a Subtract node. Once again, a Brightness/Contrast node is used to adjust the mixing levels. The output of this node is then sent to the Factor socket of a Mix Shader node, which has the original diffuse shader and a new glossy shader as its inputs (both of them with the same texture color input).
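The arithmetic of that node chain is small enough to write out directly. This plain-Python sketch (with illustrative RGB values, not measured ones) shows the factor computation and the Mix Shader weighting:

```python
def redness_factor(color):
    """The node chain in one line: Separate RGB, Add G and B,
    Divide by 2, then Subtract that average from R."""
    r, g, b = color
    return r - (g + b) / 2.0

def mix_shaders(diffuse_weight, glossy_weight, fac):
    """Mix Shader sketch: fac=0.0 is all diffuse, fac=1.0 is all glossy."""
    fac = min(1.0, max(0.0, fac))
    return (diffuse_weight * (1.0 - fac), glossy_weight * fac)

clean_copper = (0.72, 0.45, 0.20)
corrosion = (0.25, 0.50, 0.45)

# Redder copper drives the factor up, weighting the glossy shader more;
# bluish corrosion produces a negative factor, clamped to pure diffuse.
shiny = mix_shaders(1.0, 1.0, redness_factor(clean_copper))
dull = mix_shaders(1.0, 1.0, redness_factor(corrosion))
```

A Brightness/Contrast adjustment between the subtraction and the Mix Shader, as in the actual node tree, simply rescales the factor before it is clamped.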

Associating reflectivity with the red channel

c04f030.tif

The resulting textured object renders as shown in . Although it’s difficult to see in a still image, the redder areas are more reflective than the less red areas. Of course, this is not physically perfectly accurate. The red channel of the texture is not necessarily indicative of where the reflective areas should be in reality. However, it’s a decent approximation, and it’s very easy to extract from the texture.

Redder areas are shinier.

c04f031.tif

You’ll also notice that the only component of this material that really depends on the underlying object’s structure is the AO map. Wouldn’t it be nice if you could reuse the rest of this node setup for other objects with similar-looking materials and grunge patterns? It’s possible to do this using node groups, which function as ordinary reusable Blender datablocks.

To create a reusable datablock for this node setup, you need to first separate the AO pass texture component, which must be unique to individual objects (assuming they have distinct shapes and surface characteristics). For this reason, the Texture Coordinate node must be duplicated so that it is not shared by the AO Texture node. When you’ve done this, you can simply select all the other nodes except for the Material Output node, as shown in , and group them with Ctrl+G, resulting in the NodeGroup node shown in .

Selecting the node group

c04f032.tif

Grouping the nodes

c04f033.tif

This NodeGroup node can now be used in any other material and even appended or linked into other .blend files. You can edit the node group by pressing Tab to enter Edit mode, as shown in .

Editing a node group

c04f034.tif

Rendering with Renderfarm.fi

Realistic rendering of 3D content is a computationally expensive process. Cycles has been designed to make the most of your graphics processor (GPU) in addition to your CPU, making it a comparatively efficient way to render. However, if you are working on an ordinary consumer-grade computer, you’ll find that even moderately complex scenes can take a pretty long time to render cleanly (during which time your computer will probably be close to unusable). Cycles can require sample rates in the thousands to fully eliminate noise. If your interest is animation, this render time can be prohibitive.

Although rendering finished frames is resource intensive, it generally doesn’t require real-time interaction with the user. This and other qualities make it a task well suited to distributed processing. For the most part, frames needn’t be rendered in order, and even a single frame can be separated into multiple smaller render tasks over multiple computers. Ten different Cycles renders of 10 samples apiece using different random seeds can be combined to create a single render of quality equivalent to 100 samples.
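You can demonstrate this equivalence with a toy estimator (illustrative; real Cycles samples are not independent Gaussian noise, but the statistics are analogous):

```python
import random

def render(samples, seed, true_value=0.5, noise=0.5):
    """One noisy render: the mean of `samples` random light-path values."""
    rng = random.Random(seed)
    return sum(true_value + rng.gauss(0.0, noise)
               for _ in range(samples)) / samples

# Ten 10-sample renders made with different seeds, averaged together...
combined = sum(render(10, seed) for seed in range(10)) / 10
# ...carry the same effective sample count as one 100-sample render.
single = render(100, seed=99)
```

Both estimates converge on the true pixel value equally well, which is why a farm can hand out small, independently seeded chunks of the same frame.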

Distributing the task of rendering animation stills over many processors is standard. A collection of processors used for rendering is called a render farm. All animation studios use render farms for final rendering. Often, these render farms are in the form of gigantic clusters of Linux machines that can fill up a warehouse and require enough electricity to power a small town.

Subscription-based render farms exist. ResPower () is a well-known subscription-based render farm that supports Blender and offers very affordable rendering services for Blender users with some limitations on render time per frame.

Distributed Rendering for the People

Renderfarm.fi takes advantage of the distributable nature of rendering and the strong sense of community among Blender users to offer an extraordinary free distributed rendering solution.

Renderfarm.fi uses the Berkeley Open Infrastructure for Network Computing (BOINC) software to enable volunteers to donate their own computer’s unused processor time to the project, thus creating a potentially huge, worldwide, community-driven render farm that can be used by anybody.

There are a few restrictions on the kind of content that is suitable for rendering on Renderfarm.fi. The main one is that the .blend files and resulting renders must be licensed under a Creative Commons open license. This is partly out of principle, in the spirit of openness and community, but there is also a practical reason for it. The renders are being carried out on strangers’ machines, after all. If you want to keep your work secret, or if you’re doing professional work for clients, a paid service is probably more suitable. But for artists, hobbyists, or anybody working on personal projects, Renderfarm.fi offers an unbelievably cool and powerful service, and in fact it may be the only affordable way to render certain content for independent projects in a reasonable timeframe. There are a few other technical restrictions that are currently being worked out. Composite nodes and Python scripts are not yet supported.

Finally, the service is intended for animations, not still frames. You will probably not get much of a speed-up in rendering still frames anyway. There is some overhead in queuing, uploading, and downloading, which means that quick rendering projects do not benefit from using the service. As a rule of thumb, if a frame takes less than 20 seconds to render, you will not benefit from using Renderfarm.fi. On the other hand, if you’re rendering animations with 4K resolution frames in Cycles at 2000 samples, as I did recently, you’ll find a massive increase using Renderfarm.fi rather than your own resources.
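The rule of thumb falls out of a simple break-even comparison. The numbers below are illustrative guesses, not Renderfarm.fi's actual overheads or speedups:

```python
def farm_worthwhile(seconds_per_frame, frames, overhead_seconds, speedup):
    """Is the farm faster than local rendering once the fixed
    queue/upload/download overhead is paid?"""
    local = seconds_per_frame * frames
    farm = overhead_seconds + (seconds_per_frame * frames) / speedup
    return farm < local

# Quick frames: the fixed overhead dominates, so the farm loses.
quick = farm_worthwhile(5, 100, overhead_seconds=3600, speedup=16)
# Heavy frames: the speedup dwarfs the overhead, so the farm wins.
heavy = farm_worthwhile(1800, 100, overhead_seconds=3600, speedup=16)
```

The longer each frame takes, the more the one-time overhead is amortized, which is exactly why heavyweight Cycles animations benefit the most.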

Using the Add-on

Using Renderfarm.fi from within Blender couldn’t be easier. The first thing you need to do is to register an account on the website, at . Once you’ve done this, you can begin rendering your work.

1. To render on Renderfarm.fi, you must activate the Renderfarm.fi add-on in the Render add-ons, as shown in .
2. When the add-on has been activated, you’ll see a Renderfarm.fi option in the Render Engine drop-down in the Info header, as shown in . Select this as your renderer.

The Renderfarm.fi add-on

c04f035.tif

Choosing the Renderfarm.fi render option

c04f036.tif
3. When you’ve set Renderfarm.fi as the renderer, the Render properties area becomes the Renderfarm.fi interface. You can log in to your Renderfarm.fi account here, and then you’ll be able to fill in the details of your next render job, as shown in . As you can see, your past jobs are also listed here, so you can keep up with your Renderfarm.fi history directly through Blender.
4. Set the name of your job in the Title field. Optionally, you can set a description, tags, and a URL for the job as well.
5. Select Blender Internal or Cycles as your render engine. Renderfarm.fi supports both. Set the X and Y pixel dimensions, the start and end frames for the animation, and the frame rate in frames-per-second (FPS).
6. Set the memory usage to something that will safely handle your job. Don’t set this too high, though, because it will restrict the number of volunteer computers that can handle your frames.
7. Finally, set the licenses you want to use for the scene and the resulting render. These must be some variation of the Creative Commons license. CC BY-NC-ND is the most restrictive: it requires that any use of your work be attributed to you, be noncommercial, and involve no derivative works.
8. When this is all done, make sure your Internet connection is up, click the Render On Renderfarm.fi button, and get yourself a cup of coffee.

The Renderfarm.fi interface

c04f037.tif

For large scenes, the process of packaging and uploading your file can take time, during which Blender can seem to be frozen. Be patient. If all goes smoothly, your job will be submitted to the queue. If things don’t go smoothly, don’t hesitate to report your problems to the Renderfarm.fi developers. Bug reports are always welcome.

Monitoring Your Progress

You can see how things are going from your account in Renderfarm.fi. Each job you’ve submitted will show up on your account page, and its place in the queue for rendering will be shown on the Status page of the website. Jobs are briefly inspected by hand before being sent to the render farm; depending on how busy the site administrators are, your job could spend some time in the queue before being sent to the farm. This is all factored into the total time on Renderfarm.fi reported on the job page. The graph of CPU use for a nearly completed render job is shown in ; the job consists of 150 frames, 4096 by 2304 pixels in size, rendered at 2000 samples with Cycles.

Notice that the actual rendering time of 5 days 19 hours is shorter than the total time spent on Renderfarm.fi by about 8 hours. This is the amount of time this job spent on the queue. The top bar of the graph, in light orange, indicates an estimated amount of time the job would have taken on a single CPU core. If this is shorter than the second bar, then you aren’t getting any improvement by rendering on the render farm (although you might be freeing up your own computer to use for other things). In this example, though, you can see that the job is moving nearly 16 times faster than it would have on a single CPU.
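Working those figures through (rounded; the page reports them only approximately):

```python
# Time actually spent rendering on the farm: 5 days 19 hours.
farm_hours = 5 * 24 + 19            # 139 hours
queue_hours = 8                     # roughly the extra time spent queued
total_hours = farm_hours + queue_hours

speedup = 16                        # reported speedup vs. one CPU core
single_core_hours = farm_hours * speedup
single_core_days = single_core_hours / 24  # what one core would have needed
```

So the same job on a single core would have taken roughly three months instead of under six days on the farm.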

Renderfarm.fi CPU use

c04f038.tif

Contributing Resources

If you use Renderfarm.fi, or even if you just want to help out by donating some unused processing time to the project, you can contribute by downloading and installing the BOINC client from . When you run BOINC, you’ll be asked to enter a project URL. Enter , and the rest will be handled automatically. BOINC will contribute processing power to the project only when you aren’t using the computer yourself, and you can set it to follow a time schedule or not to activate when the computer is running on battery power, among other settings.

The Bottom Line

Understand rendering options and render passes. You can obtain a variety of 2D effects using plane-mapped textures, shadeless materials, toon shaders, and the edge feature in Blender’s Internal renderer.
Master It Use the Toon Shader option and the Edge Render option (you’ll find them under Post-Processing in the Render properties area) to re-create the scene from scene.blend as a toon-styled image. If you’re especially interested in toon or non-photo-real rendering, or you just want a challenge, go to and download an experimental build of Blender with the Freestyle line renderer integrated. Google for some tutorials to get started, and see what you can come up with using that renderer.
Work with the Cycles rendering engine. Cycles has a variety of shader settings, which can be combined in a nearly unlimited number of ways, yielding a wide variety of material effects.
Master It Using texture nodes and differently colored glass shaders, create a convincing glass marble. Use either the HDR image from the book files for lighting, or find another panoramic HDR image online and use that. More information on the science of HDR can be found at .
Take advantage of distributed community-based rendering on Renderfarm.fi for heavy jobs. Renderfarm.fi is a distributed rendering resource for animations, which can divide your rendering tasks among many volunteers throughout the world, making for much quicker total render times.
Master It Render 100 frames’ worth of the scene in scene.blend on the render farm. Try it using both the Cycles version and the Blender Internal version, and render it in at least three different sizes. Figure out which sizes and which sample values are too low to see an improvement in render times.
